Snap Inc. Under Fire: New Mexico’s Unredacted Lawsuit Reveals the Need for Kids’ Safety Legislation

Willa Blake is a Campaign Associate for Issue One’s Technology Reform team, and Isabel Sunderland is a youth advocate with Design It For Us and an intern with Issue One’s Technology Reform team.

Snap Inc. cofounder and CEO Evan Spiegel testifies in a Senate Judiciary hearing on online child sexual exploitation, January 31, 2024.

On September 5th, 2024, New Mexico Attorney General Raúl Torrez added to ongoing legal scrutiny of Big Tech firms by filing a lawsuit against Snap Inc., the parent company of Snapchat, alleging that the company systemically failed to protect children from sextortion, sexual exploitation, and harm. Now, an unsealed version of the complaint has been released, revealing key information about Snap’s repeated patterns of gross negligence and deceptive trade practices that have disproportionately impacted minors on the platform. Attorney General Torrez argues that the company’s design features and business models — combined with deliberate indifference from key executives — actively worked to connect minors with pedophile networks, conceal cases of child grooming, and facilitate the illegal sale of drugs and firearms.

Below are the most disturbing allegations from the unsealed lawsuit and their implications for the future of social media reform.

A Pattern of Deliberate Indifference

New Mexico’s unredacted case demonstrates an egregious pattern of apparent disregard for user safety, despite clear warning signs, in the name of customer retention. Snap’s internal documents show that the company has long been aware of widespread sextortion, grooming, and sexual abuse on its platform but has failed to take adequate action. By late 2022, Snap employees were receiving 10,000 user reports of sextortion per month, a staggering number that the company acknowledged was only “a small fraction” of actual cases. Despite this, Snap hesitated to warn users, not wanting to alarm its user base. In fact, according to the lawsuit, the platform consciously decided not to store child sex abuse images, even though doing so would have aided law enforcement and helped Snap enforce its own rules. CEO Evan Spiegel suggested the idea himself in an email chain: “We don’t want to be responsible for storing that stuff. Better if they screenshot and email ghostbusters to report.”

Snap’s own research, revealed in the unredacted filing, confirms that sextortion is pervasive. One-third of teen girls and 30% of teen boys reported being exposed to unwanted contact on Snapchat in 2022. Internal surveys from Snap revealed that over half of Gen Z users or their friends had experienced catfishing, and many were victims of sextortion. Rather than confront these alarming realities, Snap chose to ignore user reports. Alarmingly, one internal investigation concluded that 70% of victims had not reported their abuse because they knew Snap would take no action; indeed, of the 30% who did report, none of their reports were addressed.

Sextortion, however, was hardly the extent of Snap’s harm. According to the lawsuit, the company conducted a review of its platform and discovered that millions of drug and gun dealers had previously gone undetected. In fact, based on estimates from the study, there was “an average of about half a million unique users being exposed to drug-related content every day.” An internal, undated Snapchat presentation further noted that dealers had been using Snapchat’s design features like Quick Add, Mentions, and Snap Map “to reach teens on Snapchat they would never encounter in real life,” and that “some teens have even died as a result of buying drugs that they found through Snapchat.” The presentation also revealed that there were 50 posts related to illegal gun sales per day and 9,000 views per day of these marketed weapons. The same presentation acknowledged that “[m]ost bad content is not reported on Snapchat,” and that even “[r]eported content is usually viewed hundreds of times before report.”

Additionally, Snap’s internal communications reveal that the company has known for years about the risks its platform poses to young users. Despite public assurances that children under 13 are not permitted to use Snapchat, 2022 emails between Snap employees state, “I don’t think we can say that we actually verify” users’ ages. In other words, not only can any child who knows how to type a fake birthdate access the platform, but Snap’s executives knew about it and allowed it to happen.

Despite repeated warnings from Snap employees about harmful design features and content, New Mexico’s lawsuit shows that top executives consistently ignored the concerns. This willful neglect from leadership reveals a pattern of broken promises.

Broken by Design

In addition to pulling back the curtain on Snap’s apparent indifference, the unredacted complaint highlights how key design features on the platform enable and even encourage sextortion, drug and gun trafficking, and harm to kids’ mental health.

Ephemeral Design and Illicit Content:

The core design feature and selling point of Snapchat is users’ ability to send disappearing photographs and messages to each other. The ephemeral nature of this content has fueled a burgeoning gun and drug trafficking network on the platform. The unredacted lawsuit reveals that Snap was fully aware of the dangers that disappearing content poses. Snap’s communications director, for example, complained internally that while the company was “pushing back fiercely on the claim that illegal content is particularly bad on Snapchat … from what we can see, drug dealing—both buying and selling—has increased significantly.” Likewise, one employee noted that it takes “under a minute to use Snapchat… to purchase illegal and harmful substances.”

Quick Add:

Snap knew that its Quick Add feature, which recommends friends for users to connect with on the platform, has directly facilitated predatory behavior. The lawsuit revealed that regardless of whether a minor explicitly allowed Snapchat to access their contacts, they could still receive Quick Add suggestions from strangers once they had added one or two friends. For example, if a minor added only a few friends who were over the age of 18, Snap’s algorithm would immediately begin recommending more adults to connect with, including ones the user did not know. An investigation by New Mexico’s Department of Justice further confirmed that users did not have to explicitly add accounts over the age of 18 to see recommendations for adults. Snap was aware of the flaws in this feature, and internal reports reflected that bad actors would exploit it by using gaming platforms to find underage users, add them on Snapchat, and “jumpstart the algorithm to suggest additional minor friends.”

Mentions:

A related feature called “Mentions” allows users to tag their friends in group chats, stories, and snaps. However, this feature also allowed friends of the original poster to click on a mention and view the tagged account, even if they weren’t connected with that account themselves. In practice, this connected pedophiles with minors, a defect Snap was aware of. An internal investigation of one bad actor found that 6.68% of all the underage victims he “friended” on the platform were added through the Mentions feature: if one of the users the predator was already friends with posted an image and tagged another person, the predator could see the tagged account and follow that person as well. However, when presented with a potential solution for both Quick Add and Mentions, Snap leadership complained that identifying and protecting minors from predators would “create disproportionate admin costs” and, therefore, would not be justifiable. In fact, when one employee flagged the issue, another questioned, “How does proactively playing D[efense]” against potential pedophiles “help us unlock more growth?”

YOLO:

A section of the complaint that was previously fully redacted details Snap’s relationship with the anonymous messaging platform YOLO. Launched in May 2019, the app was integrated into Snapchat and allowed users to send and receive anonymous messages from their friends on both platforms. In June 2020, 16-year-old Carson Bride “took his own life after enduring months of anonymous bullying via the YOLO app on Snapchat.” One day after his mother, Kristin Bride, filed a federal lawsuit against Snap, YOLO, and LMK (another anonymous messaging app), Snapchat severed its relationship with YOLO. However, Snap executives knew long before then that YOLO and its design features were inherently harmful. (Note: Kristin Bride is a member of Issue One’s Council for Responsible Social Media. Learn more about her son, Carson, and her ongoing efforts to hold social media platforms accountable in this op-ed she wrote for USA Today.)

The unredacted portions further describe how an unnamed teen “initially downloaded YOLO to fit in but quickly observed widespread bullying on the platform.” She shared a survey she had conducted showing that, of 81 students at her school, 71 reported experiencing bullying through YOLO. Countless more complaints were filed by users, including ones describing how they were being “severely bullied and harassed,” receiving “abusive messages on YOLO as anonymous messages,” and how “someone [was] spreading [their] number on people’s YOLO,” even though they didn’t have social media.

Snap Map:

The unsealed complaint emphasizes how the “Snap Map” feature has enabled predators to connect with minors. The map shows the live location of users who opt in, and it also allows users to upload photos that appear at the location where they were taken. Depending on privacy settings, strangers can view content posted to Snap Map and potentially add the account that shared the photo. Snap acknowledged that posts to the map “could generate ‘Friend Requests’ from illegitimate friends (people who the account holder did not know and may not have wanted to be connected with).” In turn, the feature allows predators to connect with users who post to Snap Map, regardless of those users’ consent.

For example, in 2019, Steven Anthony Spoon, a 34-year-old man, befriended teen girls by posing as a girl on the app. He later found where they lived through Snap Map and “watched as they showered or changed.” Similarly, in 2018, Richard Edwards, 55, tracked an underage girl’s whereabouts on Snap Map before showing up unannounced at her house, where he “grabbed her, forcing her into a bedroom and sexually assaulting her.” Afterward, he threatened to kill her if she told anyone and continued to follow and molest her in the following weeks. Snap Map remains a key feature on the platform.

What’s more, the complaint revealed that Snap had internal documentation of a dark web “Sextortion handbook,” which showed readers how to use Snap Map to target a school by “tap[ping] on the screen to view any snap stories that might have been shared by students.” In December 2020, the United Kingdom’s Tackling Exploitation and Abuse Unit warned Snap that an app called “Hoop” allowed individuals to bypass Snapchat’s safeguards, permitting strangers to connect with users on the platform.

Snapstreaks:

The unsealed complaint adds further color to the claim that company executives knew the “Snapstreaks” feature, which encourages users to message each other daily, contributed to addictive, excessive, and harmful use of the app. In 2019, an internal presentation acknowledged that “Streaks make it impossible to unplug for a day” and that they had “become pressure-filled,” and included data on users’ fear of missing out. In a written statement to Senator Blumenthal’s office, the company acknowledged that six percent of users “indicated Streaks were a significant source of stress,” and that the company had not “studied the impact Snapstreaks had on the mental health of children nor had it conducted research specifically measuring the addictiveness of its features.”

Instead, company leadership and staff celebrated Streaks for their success in hooking users. In an internal email distributed in January 2017, employees praised the design feature: “Wow, we should have more addicting features like this,” and “If we find that streaks are addictive… then it would be something positive for ‘healthy’ long term retention and engagement.” One presentation slide on smartphone habits among users aged 13-17 indicated that 53% of those users open Snapchat “first thing in the morning,” and that “Gen Z teens (13-17) stressed the importance of streaks and even integrated the practice of sending them into their daily morning and nightly rituals.”

Towards a Solution

The New Mexico v. Snap lawsuit presents yet more evidence that social media companies have for years designed platforms that identify, isolate, and target minors, all while fighting directly against any source of accountability. The harm these platforms have inflicted on children is staggering.

Snapchat isn’t an isolated case. Meta, TikTok, and others have also built systems that expose kids to predators, drug dealers, and exploitation, all while executives claim ignorance or, worse, actively conceal the truth. Whether it’s Snap’s algorithmically created network of online predation, TikTok’s promotion of dangerous, at times even deadly, challenges to children, or Meta’s profiting from the amplification of content that encourages children to hate themselves, the pattern is clear: platforms are willing to endanger children’s lives to protect their bottom line. These companies have proven, time and again, that they cannot be trusted to prioritize the public good without oversight. The tech industry, like every other sector that directly impacts public safety, needs enforceable laws and clear guardrails. We don’t trust auto manufacturers to regulate themselves, so why do we still let tech companies do so?

Fortunately, legislative solutions can still prevail. For over two years, a broad coalition of civil society organizations, parents, and youth groups has coalesced around the Kids Online Safety Act (KOSA). KOSA offers a crucial step forward in protecting children from the everyday harms of the online world—including child sexual abuse material (CSAM), bullying and harassment, addiction, and sextortion. In fact, on September 10th, 2024, a poll released by Issue One and Fairplay revealed overwhelming support across the political spectrum for legislation requiring social media platforms to protect kids online: 86% of US voters polled support the Kids Online Safety Act, and 76% say they would be more likely to vote for a US Congressional Representative who supports the legislation.

Snap has publicly endorsed KOSA. However, both the timing of the announcement and the company’s track record suggest that the move was a strategic effort to deflect attention from egregious harms to minors. On January 25th, 2024, one week before CEO Evan Spiegel was called to testify at a key Senate hearing on Big Tech and the Online Child Sexual Exploitation Crisis, a Snap spokesperson announced: “Protecting the privacy and safety of young people on Snapchat is a top priority, and we support the Kids Online Safety Act.” Yet, as the New Mexico lawsuit highlights, the company’s rhetorical strategy is to profess support for design-focused legislation while continuing to design its platform in a manner diametrically opposed to kids’ online safety.

In fact, lobbying disclosures reveal Snap spent a record $860,000 on lobbying in 2023, besting its previous record by $180,000. Last year alone, the company employed 11 lobbyists and supported multiple trade associations to kill legislation that would hold it accountable. Together with top companies like Meta, TikTok, Discord, and X, Snap spent a combined $30 million on lobbying in 2023 and employed over 142 lobbyists—one for every four members of Congress. Two trade associations, TechNet and NetChoice, spent an additional $1.5 million last year lobbying against bills like the Kids Online Safety Act, stifling lawsuits like the one from New Mexico, and killing state online safety legislation.

The Future

If KOSA had been in place when Snap first designed features like Snap Map, Mentions, and Quick Add, the company would have had a responsibility to ensure its products were safe for kids. KOSA introduces long-overdue accountability by targeting the design features of social media platforms rather than moderating specific pieces of user content. Key components of the legislation would require companies like Snap to enforce strict privacy settings, affirmatively prevent and mitigate risks to young users, and enable the strongest safety settings by default. KOSA offers a bipartisan solution to Big Tech’s consistent and willful failure to prioritize child safety, providing crucial protections that platforms have long ignored. As cases continue to mount against these companies, the urgent need for stronger user protections and regulatory oversight is more apparent than ever.

For years, parents have been prepared to warn their children about the dangers of the real world: don’t talk to strangers, walk with a buddy, and if you see something, say something. But today, parents no longer have the tools to keep their children protected. New Mexico’s lawsuit highlights a pressing truth: parents’ worst nightmares now follow children home on their phones and computers. Social media companies have taken away children’s safe spaces. It is past time for lawmakers to rally behind KOSA and ensure these platforms face the consequences of their negligence. The next generation is on the line.
