Snapchat is a ‘breeding ground’ for child molesters, New Mexico lawsuit alleges

Snapchat logo and icon displayed on a smartphone.

On September 5, 2024, New Mexico Attorney General Raúl Torrez filed a lawsuit against Snap Inc., alleging that the design and implementation of certain features on the social media platform make it one of the “most harmful purveyors of child sexual abuse material (“CSAM”) and harmful features on children’s electronic devices.”

The lawsuit (State of New Mexico v. Snap Inc.) alleges that Snapchat, a popular social media app that lets users exchange photos and videos, is designed to attract and addict young people; openly promotes and markets “illegal child sexual material”; and facilitates “sextortion” and the trafficking of children, drugs and weapons. This, combined with the company’s intentional misleading of the public about the security and design of its platform, amounts to a public nuisance in violation of the state’s Unfair Practices Act, according to the 164-page filing.

The alleged conduct at issue in the case includes:

  • Implementing design features and policies that fail to detect or verify users’ actual ages
  • Preventing effective parental controls and reporting mechanisms
  • Allowing predators to identify, contact, manipulate, and extort children, and through these contacts develop CSAM
  • Designing algorithms and features that connect child abusers with children and enable them to find victims
  • Creating a virtual market for the marketing and sale of illegal drugs and weapons to children
  • Failing to warn, and affirmatively misleading, parents and children about the presence of sex trafficking, sexual exploitation, and drug and weapons sales on the platform
  • Failing to report CSAM
  • Using features such as ephemeral content and “streaks” that reward compulsive Snapchat use
  • Aggressively sending notifications of new content to users

The lawsuit details the many ways in which Snapchat has allegedly become a “breeding ground” for child molesters, as revealed by an undercover investigation conducted by the New Mexico Department of Justice. The office first set up a fake account for a 14-year-old girl with the username Sexy14Heather (“Heather”), initially listing her registration age as 18 but later changing it to that of a minor. Within a day of searching only for other 15-year-olds on the app, and without adding any other users, Heather received a friend request from Enzo (Nud15Ans), who quickly asked to exchange anonymous messages outside of Snapchat via an ngl.link. After this one exchange, and despite her account being private, Snapchat suggested nearly a hundred other users to Heather, including adults looking to exchange sexually explicit content. An additional search suggesting Heather was looking for other users under the age of 18 led to continued recommendations for explicit accounts with usernames like “naughtypics,” “gayhorny13yox,” and “teentradevirgin.” And despite never engaging with them directly, the decoy account received push notifications pointing to even more explicit content.

Figure 6 of the State of New Mexico v. Snap Inc. lawsuit.

Images of erect penises and selfies of adult men ended up in Sexy14Heather’s Snap messages when investigators began directly contacting these explicit accounts in ways that made it clear that Heather was a minor. “The ease with which accounts representing minors were located and targeted by malicious users highlights the way Snap facilitates sexual exploitation and abuse on its platforms,” the filing states.

Furthermore, investigators were able to use Snapchat’s search function to access specific types of sexually explicit content. A search combining terms for genitalia and children yielded accounts promoting pedophilia. Searches for “pizza vendors” (a proxy for child pornography), “trade young,” and “trade girls” surfaced dozens of users selling or soliciting child pornography to Heather. Snapchat also allowed searches for “child rape, necrophilia, bestiality, and a variety of other fetishes.”

Figures 18 and 21 from the State of New Mexico v. Snap Inc. lawsuit.

Investigators also turned to the deep web to learn more about the spread and scale of CSAM originating on Snapchat. In a search of the non-indexed deep web, they identified more than 10,000 records associated with Snap and CSAM over the past year, including information about minors under the age of 13 who had been sexually abused.

The complaint goes on to say that Snapchat is the app of choice for criminals running financial “sextortion” schemes, in which predators solicit explicit images from users and then extort money from the senders to prevent the images from being distributed to friends and family, a practice that has driven numerous children to suicide. Not only is Snap the most widely used app for coercing victims into sending sexually explicit images, according to the filing, but it also allows attackers to deploy “widely distributed” scripts that “easily bypass Snap’s inadequate security measures.” Snap has allegedly failed to proactively detect the use of these scripts or to stop known cybercriminals.

This evidence “represents just one example of the child sexual exploitation material authorized, developed, promoted, and disseminated by Snapchat — through which more than half of American teenagers who use Snapchat are not only exposed to viewing, but also manipulated into providing sexually explicit material or recommended or introduced to child molesters,” the complaint says. The state is seeking civil penalties of up to $5,000 for each violation and to permanently enjoin Snap and its employees from engaging in the alleged public nuisance the company has caused.

Thursday’s complaint against Snap employs the same novel legal strategy that Attorney General Torrez is using against Meta. In December 2023, his office filed a 225-page lawsuit against Meta and its CEO, Mark Zuckerberg, for allegedly failing to protect children from sexual abuse, online solicitation and human trafficking, in violation of New Mexico’s Unfair Practices Act. That lawsuit alleges that Instagram and Facebook’s recommendation algorithms helped child molesters contact underage users and exchange CSAM, amounting to a “public nuisance” by “creating and maintaining a condition that is prejudicial to the health and safety of thousands of New Mexico residents and interferes with their enjoyment of life.”

Meta has since sought to have the case dismissed, but a judge denied the motion in May, allowing the case to proceed. Attorney General Torrez characterized the decision as the end of an era in which platforms have avoided legal liability for their roles in online child abuse. “All social media platforms that harm their users should be put on notice,” he said in a press release about the ruling. (Torrez also told Tech Policy Press during a podcast appearance in August that the tech industry can expect to see similar criminal investigations continue.)

While Attorney General Torrez seeks to take the lead in holding social media companies accountable for facilitating child predation and “sextortion” on their platforms, other attorneys general’s offices across the country are also targeting the platforms for their “addictive” design features and negative impact on young users. Last October, a bipartisan coalition of 42 state attorneys general filed nearly a dozen lawsuits against Meta, alleging that the company knowingly designed and implemented harmful features on its social media platforms that intentionally addicted children and teens and, according to some of the filings, did so in violation of the Children’s Online Privacy Protection Act (COPPA). (New Mexico has not signed on to any of the lawsuits against Meta.)

There is also significant concern in Washington about the safety of children and teens online. Earlier this year, lawmakers dragged executives from Discord, Meta, Snap, TikTok and X before the Senate Judiciary Committee to discuss the “online child sexual exploitation crisis.” There, Meta’s Mark Zuckerberg and TikTok’s Shou Chew faced by far the toughest questions from senators. One particularly notable moment of the hearing came when Zuckerberg, at the urging of Sen. Josh Hawley (R-MO), turned to the row of “victim families” to offer his condolences and pledged that the platform would continue “its industry-leading efforts” to protect children online. However, only one question was reserved exclusively for Snap CEO Evan Spiegel on the topic of drug trafficking on his app. Spiegel apologized to the parents of children who accessed illegal drugs on Snapchat, some of whom have died of overdoses, for Snap’s failure to “prevent these tragedies.”
