San Francisco city attorney files lawsuit to shut down AI-powered deepfake nudity websites


San Francisco City Attorney David Chiu has announced that his office is suing the operators of 16 websites that use AI to create and distribute non-consensual deepfake nude images of women and girls, amid increased scrutiny of such imagery.

The lawsuit, the first of its kind in San Francisco, accuses the website operators of violating state and federal laws against deepfake pornography, child pornography and revenge pornography, as well as California’s unfair competition law.

Chiu wants to raise the alarm about these abuses

According to the New York Times, the initiative was the brainchild of Deputy Chief Attorney Yvonne Mere, who urged her colleagues to file a lawsuit aimed at shutting down the 16 websites.

The names of the websites were redacted from the copy of the lawsuit made public on Thursday.

While the city attorney’s office has not yet identified most of the website owners, officials there are optimistic about tracking them down and holding them accountable.

At a press conference Thursday, Chiu said the sites produce “pornographic” material without the consent of the people in the photos.

Chiu has indicated that the lawsuit is intended to shut down the websites, raise the alarm about this form of “sexual abuse,” and put an end to it.

“This investigation has taken us into the darkest corners of the internet, and I am absolutely appalled for the women and girls who have endured this exploitation,” Chiu said.

On these websites, users upload photos of fully clothed, real people, and AI then alters the images to simulate what the person would look like naked.

As noted in the lawsuit, one of the sites promotes the non-consensual nature of the images, claiming, “Imagine wasting time taking her out on dates when you could just use (removed website name) to get her nude photos.”

Because open-source AI models are freely available, anyone can download these engines and adapt them for their own purposes. The result is a crop of sites and apps that can generate deepfake nudes from scratch or ‘nudify’ existing photos in realistic ways, often for a fee.

San Francisco isn’t the only place facing this challenge

In January, deepfake apps made headlines when fake nude photos of Taylor Swift went viral online. Many other, far less famous people have been targeted both before and since.

Chiu has admitted that the “distribution of these images has exploited a shocking number of women and girls around the world,” from celebrities to high school students.

Through its own investigation, the city attorney’s office found that the websites were visited more than 200 million times in the first six months of this year. The office expressed concern that once an image is online, it becomes difficult for victims to determine which websites were used to “nudify” their photos.

This is because the images carry no unique or identifying features that can be traced back to the sites that produced them. It is also difficult for victims to get the photos removed from the internet, which harms their self-esteem and digital footprint.

Earlier this year, five Beverly Hills eighth-graders were expelled from school after creating and sharing fake photos of sixteen classmates whose faces had been superimposed onto AI-generated bodies.

Chiu’s office said it has observed similar incidents at other schools in California, New Jersey and Washington, where images were used to demean, bully and threaten women and girls.

According to the city attorney’s office, this abuse has had a devastating impact on victims’ reputations, mental health and self-confidence, and in some cases has even left them suicidal.
