San Francisco sues AI websites that create nude deepfakes of women and girls

Nearly a year after AI-generated nude photos of high school girls rocked a community in southern Spain, a juvenile court this summer handed 15 of their classmates one-year suspended prison sentences.

But the artificial intelligence tool used to create the harmful deepfakes is still easily accessible on the internet, promising to “undress” any photo uploaded to the website within seconds.

Now, a new effort is underway in California to shut down the app and others like it. This week, San Francisco filed a first-of-its-kind lawsuit. Experts say the case could set a precedent, but it will also face many obstacles.

“The distribution of these images has exploited a shocking number of women and girls around the world,” said David Chiu, the elected city attorney of San Francisco who brought the case against a group of high-traffic websites with ties to entities in California, New Mexico, Estonia, Serbia, the United Kingdom and elsewhere.

“These images are used to harass, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on the victims has been devastating in terms of reputation, mental health, loss of autonomy and in some cases even causing some to become suicidal.”

The lawsuit filed on behalf of California residents alleges that the services violated numerous state laws against fraudulent business practices, nonconsensual pornography and child sexual abuse. But it can be tricky to determine who controls the apps, which aren’t available in phone app stores but are still easy to find online.

One service contacted by the AP late last year claimed by email that its “CEO is based and travels throughout the U.S.,” but it declined to provide evidence or answer other questions. The AP is not naming the specific apps being sued so as not to promote them.

“There are a number of sites that we don’t know exactly who these operators are and where they’re operating from at this point, but we have investigative tools and subpoena power to investigate those,” Chiu said. “And we will certainly use our powers as we pursue this litigation.”

Many of the tools are used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also turned up in schools around the world, from Australia to Beverly Hills, California, usually with boys creating the images of female classmates that are then circulated on social media.

In one of the first cases to receive wide publicity, in Almendralejo, Spain, last September, a physician whose daughter was among the victims helped bring the case to the public’s attention. She said she was satisfied with the severity of the sentence their classmates received after a court ruling earlier this summer.

But tackling the problem is “not only the responsibility of society, of education, of parents and schools, but also the responsibility of the digital giants that profit from all this junk,” Dr. Miriam Al Adib Mendiri said in an interview Friday.

She applauded San Francisco’s action but said more efforts are needed, including from larger companies such as California-based Meta Platforms and its subsidiary WhatsApp, which was used to distribute the images in Spain.

As schools and law enforcement agencies try to punish those who create and share deepfakes, authorities are grappling with what to do about the tools themselves.

In January, the European Union’s executive arm said in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall under the bloc’s sweeping new rules for bolstering online safety because it is not a large enough platform.

Organizations that track the growth of AI-generated child sexual abuse material will be closely watching the case in San Francisco.

The lawsuit “has the potential to set a legal precedent in this area,” said Emily Slifer, policy director at Thorn, an organization that works to end child sexual exploitation.

But a Stanford University researcher said that because so many of the defendants are based outside the U.S., it will be harder to bring them to justice.

Chiu “has a tough case to make, but could potentially get some of the sites taken offline if the defendants who operate them ignore the lawsuit,” said Stanford’s Riana Pfefferkorn.

She said that could happen if the city wins by default and obtains injunctions covering domain name registrars, web hosts and payment processors, “which would effectively shut down those sites even if their owners never appear in court.”

—Matt O’Brien and Haleluya Hadero, Associated Press
