New Mexico sues Snapchat over alleged child abuse and security lapses

In a recent development, New Mexico Attorney General Raúl Torrez announced a lawsuit against Snap, Inc., accusing the company of endangering children through its platform, Snapchat. The lawsuit alleges that Snapchat's design features, policies, and recommendation algorithms facilitate harmful activities targeting minors.

The lawsuit alleges that Snapchat’s ephemeral content gives users a false sense of security that can be exploited by malicious actors. The platform’s recommendation algorithm is accused of connecting inappropriate accounts to minors, increasing the risk of harmful interactions. The platform’s inability to verify users’ identities is seen as a factor that could lead to minors accessing unsafe content.

The New Mexico DOJ's investigation alleges that Snapchat is often used by individuals to engage in harmful activities targeting minors. As part of its investigation, the DOJ set up a decoy account for a 14-year-old that reached out to accounts with sexually explicit names, prompting disturbing interactions that bolster the lawsuit's claims. Investigators also found approximately 10,000 records of abusive content linked to Snapchat on dark web sites.

According to the official website of the New Mexico Department of Justice, Attorney General Torrez said: "Our undercover investigation found that Snapchat's harmful design features create an environment where predators can easily target children through sextortion schemes and other forms of sexual abuse. Snap has misled users into believing that photos and videos sent on its platform will disappear, but predators can permanently capture this content and have created a virtual yearbook of child sexual images that are traded, sold, and stored indefinitely. Through our lawsuits against Meta and Snap, the New Mexico Department of Justice will continue to hold these platforms accountable for prioritizing profits over children's safety."

The lawsuit also describes Snapchat as a primary social media platform for sharing child sexual abuse material (CSAM). It claims that parents report their children sharing more CSAM on Snapchat than on any other platform, that minors report more online sexual interactions there than anywhere else, and that more sex trafficking victims are recruited on Snapchat than on any other service.

The lawsuit accuses Snap, Inc. of negligence, alleging that the company puts platform engagement ahead of user safety. It cites Snapchat's widespread use among U.S. teenagers, with more than 20 million teens reportedly active on the platform.

Snapchat’s Community Guidelines explicitly state: “We prohibit all activities that involve the sexual exploitation or abuse of a minor, including sharing images of child sexual exploitation or abuse, grooming or sexual extortion (sextortion). When we identify such activities, we report all instances of child sexual exploitation to the authorities, including attempts to engage in such behavior. Never post, store, send, forward, distribute or solicit nude or sexually explicit content involving anyone under the age of 18 (this includes sending or storing such images of yourself). We prohibit the promotion, distribution or sharing of pornographic content. We also do not allow commercial activities involving pornography or sexual interactions (both online and offline). Breastfeeding and other depictions of nudity in non-sexual contexts are generally allowed.”

This lawsuit follows a similar lawsuit filed against Meta Platforms in December, which also accused the company of failing to protect children from sexual abuse and exploitation.
