California bans AI-generated nude photos of children

On Sunday, California Governor Gavin Newsom signed two proposals that aim to close a loophole when it comes to AI-generated nude photos of children.

AB 1831, authored by Assemblymember Marc Berman, clarifies that child pornography is illegal even if it is generated by AI. Under previous law, prosecutors could not take legal action against those in possession of AI-generated, deepfake nude images of children unless they could prove the images depicted a real child.

In a Facebook post, Berman said the proposal was one of his “highest priority bills.”

“AI used to create these horrible images is trained on thousands of images of real children being abused, revictimizing those children,” he wrote.

The prevalence of AI-generated child pornography has been studied in Britain and Australia. In Britain, an Internet Watch Foundation investigation found that a dark web forum hosted more than 20,000 AI-generated images in a single month. More than 10 percent of those images met the legal threshold for child sexual exploitation material.

Meanwhile, the Australian Center to Counter Child Exploitation received almost 50,000 reports of child sexual exploitation material between 2023 and 2024, an increase of more than 9,000 reports from the previous year.

Newsom’s latest signing comes amid the governor’s broader crackdown on AI. Earlier this month, Newsom signed bills that make it easier to identify deepfake content, make it illegal to distribute sexually explicit images of a real person that appear authentic, and allow users to report sexually explicit deepfakes of themselves.

“No one should be threatened by someone on the internet who can deepfake them, especially in sexually explicit ways,” Newsom said. “We are in an era where digital tools such as AI have enormous potential, but can also be abused against other people. We are taking action to protect Californians.”
