California’s governor signs bills to protect children from AI deepfake nude photos

SACRAMENTO, Calif. (AP) — California Gov. Gavin Newsom on Sunday signed a series of proposals aimed at protecting minors from the increasingly common misuse of artificial intelligence tools to generate harmful sexual images of children.

The measures are part of California’s concerted effort to strengthen regulations around the rapidly growing AI industry, which increasingly affects Americans’ daily lives but has faced little to no oversight in the United States.

Earlier this month, Newsom also signed some of the strongest laws in the country to crack down on election deepfakes, though those laws are being challenged in court. California is widely seen as a potential leader in regulating the AI industry in the U.S.

The new laws, which received overwhelming bipartisan support, close a legal loophole surrounding AI-generated child sexual abuse images and make clear that child pornography is illegal even if it is AI-generated.

Current law does not allow prosecutors to go after people who possess or distribute AI-generated images of child sexual abuse if they cannot prove the materials depict a real person, supporters said. Under the new laws, such a crime would qualify as a misdemeanor.

“Child sexual abuse material should be illegal to create, possess and distribute in California, regardless of whether the images are AI-generated or of real children,” Democratic Assemblymember Marc Berman, who wrote one of the bills, said in a statement. “AI used to create these horrible images is trained on thousands of images of real children being abused, revictimizing those children.”

Newsom also signed two other bills earlier this month to strengthen revenge porn laws, aiming to protect more women, teenage girls and others from sexual exploitation and harassment powered by AI tools. Under state law, it is now illegal for an adult to create or share AI-generated sexually explicit deepfakes of a person without their consent. Social media platforms are also required to give users a way to report such material for removal.

But some laws don’t go far enough, said Los Angeles District Attorney George Gascón, whose office sponsored some of the proposals. Gascón said new penalties for sharing AI-generated revenge porn should have included young people under the age of 18. The measure was narrowed by state lawmakers last month to apply only to adults.

“There have to be consequences. You don’t get a free pass because you’re under 18,” Gascón said recently in an interview.

The laws come after San Francisco filed a first-in-the-nation lawsuit against more than a dozen websites hosting AI tools that promised to “undress any photo” in seconds.

The problem with deepfakes isn’t new, but experts say it’s getting worse as the technology to produce them becomes more accessible and easier to use. Researchers have raised alarms over the past two years about the explosion of AI-generated child sexual abuse material using images of real victims or virtual characters.

In March, a Beverly Hills school district expelled five high school students for creating and sharing fake nudes of their classmates.

The issue has led to swift bipartisan action in nearly 30 states to address the proliferation of AI-generated sexual abuse material. Some of those laws offer protection for everyone, while others ban only material depicting minors.

Newsom has touted California as an early adopter and regulator of AI technology, saying the state could soon deploy generative AI tools to tackle traffic congestion and provide tax advice, even as his administration considers new rules against AI discrimination in hiring practices.

Copyright © 2024 Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
