City Attorney Takes Action Against Nonconsensual Deepfake Pornography Providers – NBC Bay Area

Concerns about artificial intelligence have been catapulted to the forefront of American discourse at a time when misinformation abounds. AI is everywhere, from predictive text that suggests a typist’s next word to deliberately misleading images, videos and memes.

But there is an even more serious application of AI: so-called deepfake pornography, where sexually explicit images of real people are created without their consent.

On Thursday, the office of San Francisco City Attorney David Chiu launched what it calls the first lawsuit of its kind against websites that create and distribute nonconsensual pornography using AI. Chiu alleges that these sites not only “undress” adults, but also manipulate and create pornographic images of children.

Celebrities including Gal Gadot, Natalie Portman, Emma Watson and Scarlett Johansson have also been affected by deepfake porn images.

“The spread of non-consensual deepfake pornographic images has exploited real women and girls around the world,” Chiu said in a statement. “This investigation has taken us into the darkest corners of the internet, and I am absolutely appalled for the women and girls who have endured this exploitation.”

The lawsuit, filed in San Francisco Superior Court on behalf of the people of California, alleges violations of state and federal laws prohibiting deepfake pornography, revenge porn, and child pornography, as well as violations of California’s Unfair Competition Law.

Some of the 16 companies named in the lawsuit are based in the United Kingdom or Estonia and reach millions of people. One is based in Florida and owned by a Ukrainian businessman; others are based in Los Angeles and New Mexico. Several more are identified only as “Does” because the city attorney’s office has not yet been able to identify them.

None of the companies involved in the lawsuit could be reached for comment Thursday.

Most of the sites allow users to upload images they want to “undress,” regardless of whether the uploader has permission to use the image or images. According to Chiu, this means images of children are also being submitted. The sites run on open-source AI models that have been modified and retrained into versions highly effective at generating pornographic content, he said.

“These highly popular, sophisticated models not only generate pornographic content featuring fictional AI-generated individuals, but also manipulate images of real people to produce fictional pornographic content depicting those individuals,” the lawsuit’s complaint reads. “The models are able to recognize clothing and body features in an image of a person, and can be further conditioned to manipulate the image to generate a fake, photorealistic image that retains the person’s face but replaces their clothed body with a nude body — giving the appearance of ‘undressing’ the person and exposing their intimate body parts. These models ‘undress’ or ‘nude’ not only adults, but also children.”

Such images are created not only for the user’s own consumption, but can also be used to “bully, threaten or humiliate” women and girls, the complaint said.

Chiu cites AI-generated nude photos that circulated at a Beverly Hills middle school in February and targeted 16 eighth-graders.

“Generative AI has tremendous promise, but as with all new technologies, there are unintended consequences and criminals who try to exploit the new technology. We need to be very clear that this is not innovation — this is sexual abuse,” Chiu said.

Chiu’s lawsuit seeks to have the websites taken offline and for the defendants to be permanently banned from further participation in the alleged unlawful conduct. The lawsuit also seeks civil penalties and costs for bringing the lawsuit.

As the city attorney of a major city, Chiu can file lawsuits on behalf of the people of the state, his office said.

“That’s the same thing we did when we sued the opioid industry, the fossil fuel industry, the gun manufacturers, etc.,” spokesperson Jen Kwart said in an email. “We can sue entities outside of California if they violate California law, which these companies are doing. There have been media reports of victims of this behavior in California.”
