Deepfake AI ‘Undress’ Porn Sites Sued in California Court

The city of San Francisco on Thursday filed a massive lawsuit against 18 websites and apps that generate unauthorized deepfake nude photos of unsuspecting victims.

The complaint, published with the names of the defendants’ services redacted, takes aim at the “proliferation of websites and apps that offer to ‘undress’ or ‘nude’ women and girls.” It alleges that the sites were collectively visited more than 200 million times in the first six months of 2024.

“This investigation has taken us into the darkest corners of the internet, and I am absolutely appalled for the women and girls who have endured this exploitation,” San Francisco City Attorney David Chiu said in announcing the lawsuit. “Generative AI holds tremendous promise, but as with all new technologies, there are unintended consequences and criminals who seek to exploit the new technology.”

“This is not innovation, this is sexual abuse,” Chiu added.

While celebrities like Taylor Swift are often the targets of this kind of imagery, he pointed to recent cases that made headlines involving high school students in California.

“These images, which are virtually indistinguishable from real photographs, are being used to extort, harass, threaten and humiliate women and girls,” the city’s announcement said.

The rapid spread of what are known as non-consensual intimate images (NCII) has led to efforts by governments and organizations around the world to curb the practice.

“Victims are left with little to no recourse as they face significant obstacles to removing these images once they have been distributed,” the complaint reads. “They are left with severe psychological, emotional, economic, and reputational harm, and without control and autonomy over their bodies and images.”

Even more problematic, Chiu noted, is that some sites “allow users to create child pornography.”

AI-generated child sexual abuse material (CSAM) is especially troubling because it severely hampers efforts to identify and protect real victims. The Internet Watch Foundation, which tracks the issue, has said that known pedophile rings have already embraced the technology and that AI-generated CSAM could “overwhelm” the internet.

This month, a Louisiana law went into effect that specifically prohibits AI-generated child sexual abuse material.

While major tech companies have pledged to prioritize child safety in AI development, such images have already turned up in the datasets used to train AI models, according to researchers at Stanford University.

The lawsuit demands that the services pay $2,500 for each violation and cease operations. It also asks the court to require domain name registrars, web hosts, and payment processors to stop providing services to operators of sites that create non-consensual deepfakes.

Source: https://decrypt.co/245025/deepfake-ai-undress-porn-sites-sued-in-california-court
