Bluesky struggles with moderating child abuse material in Portuguese

⚠️

This report addresses a sensitive topic that may be disturbing to some readers. However, it does not contain any graphic images.

Bluesky, a social network that has emerged as an alternative to X and now has more than 10 million users, faces major challenges in moderating Portuguese-language content related to the sexual exploitation of children and adolescents.

A study by Núcleo, conducted in collaboration with independent researchers, identified 125 Portuguese-language profiles that share or sell illegal material, including explicit photos of child sexual abuse, without any censorship of the images.

Working with Brazilian researchers Tatiana Azevedo and Letícia Oliveira, Núcleo’s investigation identified, over a two-week period, profiles and posts that blatantly violated Bluesky’s terms of service and local law.

During the analysis, we found that Bluesky did not moderate phrases, terms or emojis commonly associated with child sexual abuse material (CSAM) in Portuguese, even when they were explicit.

In a statement to Núcleo, Bluesky said that “our moderation team removes accounts that engage in this content, and we remove this data from our infrastructure. Over the past month, we have increased the size of our moderation team to handle the influx of new accounts.”

The social network’s moderation team has been overloaded since X was blocked in Brazil in late August, causing Bluesky to add more than 2.5 million new Brazilian users in less than a week.

On September 5, 2024, just six days after X was suspended in Brazil, Aaron Rodericks, Head of Trust & Safety at Bluesky, reported a tenfold increase in reports of this criminal content on the network compared to the previous week.

All the questions we asked Bluesky

  • Does Bluesky have different strategies for handling text posts that discuss child sexual abuse or child luring versus images of child exploitation?
  • If so, what are those strategies?
  • Does Bluesky have a team dedicated to moderating child exploitation content? If so, how many people are on it?
  • Does Bluesky use image hashing (fingerprinting) for this type of content?
  • Why doesn’t Bluesky block known keywords and terms like ‘child pornography’ or ‘cp’?
  • Where can users track the status of reported content and see if it has been reviewed by Bluesky?

Bluesky’s full response:

“At Bluesky, we take child abuse material very seriously. We have automated systems that scan image and video content to preemptively block instances of CSAM, and we have a 24/7 moderation team that responds to reports. Our moderation team removes accounts that engage in this content, and we remove this data from our infrastructure. In the past month, we have increased the size of our moderation team to handle the influx of new accounts.”

Aaron Rodericks, Bluesky’s Head of Trust and Safety, shared additional details in a note: “We use image hashing to identify known instances of CSAM shared on Bluesky. Additionally, we investigate reports and networks engaged in CSAM, even if the content is not on Bluesky, and remove any involved users from the site.”
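The image hashing Rodericks describes is a standard technique: each uploaded image is reduced to a fingerprint, which is then checked against a database of fingerprints of previously identified abuse material, such as the hash lists maintained by organizations like NCMEC. Bluesky has not published its implementation; the sketch below is only a minimal, hypothetical illustration of the matching step, using an exact SHA-256 hash where production systems use perceptual hashes such as PhotoDNA or PDQ.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Exact-match fingerprint of an uploaded file (SHA-256, for illustration only)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash list of previously identified material. In production this
# would come from a vetted source such as NCMEC, and the fingerprints would be
# perceptual hashes (e.g. PhotoDNA or PDQ) that also match re-encoded or
# slightly edited copies, unlike the exact cryptographic hash used here.
known_hashes = {fingerprint(b"bytes-of-a-previously-identified-image")}

def should_block(image_bytes: bytes) -> bool:
    """Return True if an upload matches a fingerprint on the known list."""
    return fingerprint(image_bytes) in known_hashes

# A matching upload would be blocked before publication (and reported);
# anything else falls through to ordinary moderation and user reports.
print(should_block(b"bytes-of-a-previously-identified-image"))  # True
print(should_block(b"some-other-upload"))                       # False
```

Perceptual hashing is what makes the approach practical in this context: even a trivial edit changes a cryptographic hash completely, while a perceptual hash stays close enough to still match known material.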

How to report a post on Bluesky
To report a post, go to the content you want to report, click the three dots (…) and select “Illegal and urgent.” You can include up to 300 characters of context in your report.

Research

Over a two-week period, the investigation mapped 125 Portuguese-language profiles on Bluesky that solicited, sold or shared child sexual abuse material (CSAM). Each profile or message contained at least two of the following elements:

  • Requests for sexually explicit images of children or adolescents;
  • Hashtags or keywords with sexual or criminal content;
  • Acronyms or references to “child pornography” and similar terms.

📝

Letícia Oliveira and Tatiana Azevedo are independent Brazilian researchers specializing in online extremism. They have previously contributed to reports on school attacks for the Brazilian Ministry of Education and have collaborated on several investigative pieces with Núcleo.

GROUPS. This gap in the moderation of basic terms also means that posts linking to groups engaged in child sexual exploitation on messaging apps like Telegram remain active on Bluesky.

The messages examined in the investigation contained explicit requests from users to ‘join pedophile groups’. In the comments, users even shared their own phone numbers.

In August, Telegram CEO and founder Pavel Durov was arrested in France on suspicion of complicity in crimes on the platform, including the sale of CSAM.

Self-generated CSAM

Last year, Stanford University mapped the measures tech companies are taking to combat “self-generated CSAM,” a term for sexual imagery that minors capture of themselves and that is then sold or shared online.

The research found that most platforms struggle to implement effective changes to their moderation algorithms to prevent the publication of such content. X and Discord were highlighted as major distributors of this type of material.

Now, Núcleo has identified similar self-generated content being shared on Bluesky. One example is a post featuring a sexually explicit photo of an apparent minor, accompanied by the caption “14y (years) here,” which received 90 likes in just a few hours.

New methods

In addition to well-known terms and keywords, users on Bluesky also use specific emojis associated with certain sports to index CSAM.

The practice of using symbols to represent communities, both online and offline, is not new. In 2007, WikiLeaks published an official document detailing how the Federal Bureau of Investigation was tracking symbols and artifacts used by pedophiles around the world.

Some of the symbols described in that 2007 FBI document were identified by Núcleo in posts analyzed on Bluesky, while others, particularly those related to sports, are more recent trends.

Oliveira and Azevedo trace the use of these emojis to the 2015 arrest of a former American football player for the sexual abuse of a child. According to a criminal file obtained by Núcleo and the researchers, the player also filmed the crime.

The former player’s name, his old jersey number and references to the alleged video of the crime now appear on Bluesky alongside CSAM.

How moderation works on Bluesky

The messages and profiles discovered during the investigation violate not only Brazilian law but also Bluesky’s terms and conditions, which explicitly prohibit the “normalization of pedophilia” by its users.

Núcleo asked Bluesky about the methods used to identify and remove CSAM from the platform. The app said it uses a combination of automated systems, human moderators working around the clock, and image-hashing technology.

Rodericks told Núcleo that Bluesky also investigates reports of networks involved in child exploitation outside the platform, with a view to removing any involved users from the site.

In Brazil, local law requires service providers to retain user data for six months. In the U.S., the recently passed REPORT Act requires data to be retained for at least one year in CSAM cases reported to the National Center for Missing and Exploited Children (NCMEC). Bluesky said it does not retain images and videos of CSAM beyond what is necessary to report them to authorities.

The failure to retain data related to online crime has previously posed a challenge for Brazilian authorities. During a wave of school attacks and shootings in April 2023, for example, platforms failed to retain data on users who promoted violence.

How we did this

Researchers Letícia Oliveira and Tatiana Azevedo contacted a Núcleo reporter to report the presence of child sexual exploitation content on Bluesky. This led to a broader investigation and analysis of the terms, emojis and keywords that the researchers had already mapped. We then checked which posts and comments were still available and contacted Bluesky’s press department.

💡

Due to the illegal and disturbing nature of the material, Núcleo has decided not to publish screenshots of it, even with redactions. If any organizations or authorities investigating child sexual exploitation would like access to the material for their investigation, they can contact us by email at (email address).
Reporting by Sofia Schurig
Editing by Alexander Orrico and Sergio Spagnuolo

Translated using ChatGPT and reviewed by human editors
