San Francisco Sues AI ‘Undressing’ Websites Over Non-Consensual Nude Images

TLDR

  • San Francisco’s City Attorney is suing 16 websites that use AI to create non-consensual nude images of women and girls
  • The sites were visited more than 200 million times in the first six months of 2024
  • Some sites allow users to create pornographic images of children
  • Victims face difficulties removing these AI-generated images once shared online
  • The lawsuit seeks $2,500 per violation and aims to shut down the sites

San Francisco City Attorney David Chiu has filed a lawsuit against 16 websites that use artificial intelligence (AI) to create non-consensual nude images of women and girls.

The lawsuit, filed in San Francisco Superior Court, targets sites that allow users to “undress” or “nudify” people in photos without their consent.

According to Chiu’s office, these websites have received over 200 million visits in just the first six months of 2024. The lawsuit claims the site owners include individuals and companies from Los Angeles, New Mexico, the United Kingdom, and Estonia.

The AI models used by these sites are reportedly trained on pornographic images and child sexual abuse material. Users upload a photo of their target, and the AI generates a realistic pornographic version. While some sites claim to restrict their service to images of adults, others allow users to generate images of children as well.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu stated.

He emphasized that while AI has “enormous promise,” criminals are exploiting the technology for abusive purposes.

The lawsuit alleges that these AI-generated images are “virtually indistinguishable” from real photographs. They have been used to “extort, bully, threaten, and humiliate women and girls,” many of whom have no way to control or remove the fake images once they’ve been created and shared online.

In one troubling incident highlighted by Chiu’s office, AI-generated nude images of 16 eighth-grade students were circulated among students at a California middle school in February. The targeted students were 13 and 14 years old.

The legal action seeks $2,500 in penalties for each violation and an order requiring the sites to cease operations. It also calls for domain name registrars, web hosts, and payment processors to stop providing services to companies that create these AI-generated deepfakes.

The rapid spread of what experts call non-consensual intimate imagery (NCII) has prompted efforts by governments and organizations worldwide to address the issue. The use of AI to generate child sexual abuse material (CSAM) is particularly concerning, as it complicates efforts to identify and protect real victims.

The Internet Watch Foundation, which tracks online child exploitation, has warned that known pedophile groups are already embracing this technology. There are fears that AI-generated CSAM could overwhelm the internet, making it harder to find and remove genuine abuse material.

In response to these growing concerns, some jurisdictions are taking legislative action. For example, a Louisiana state law specifically banning AI-created CSAM went into effect this month.

Major tech companies have pledged to prioritize child safety as they develop AI technologies. However, researchers at Stanford University have found that child sexual abuse material has already made its way into public datasets used to train AI image generators, highlighting the complexity of the problem.

Source: https://blockonomi.com/san-francisco-sues-ai-undressing-websites-over-non-consensual-nude-images/
