Telegram ignored offers of help from child safety watchdogs, groups say

Before Telegram’s CEO was arrested in France, the app had a reputation for ignoring advocacy groups fighting child exploitation.

Three of those groups, the National Center for Missing & Exploited Children (NCMEC) in the U.S., the Canadian Centre for Child Protection and the Internet Watch Foundation in the U.K., told NBC News that their communications with Telegram about child sexual abuse material, often abbreviated as CSAM, have been largely ignored by the platform.

Pavel Durov, co-founder and CEO of Telegram, remains in the custody of French authorities, who arrested him on Saturday. The messaging and news app is widely used in former Soviet countries and increasingly popular among the far right in the U.S. and among groups banned from other platforms.

The Paris prosecutor, who has not yet announced charges, said Monday that Durov was arrested as part of an investigation into an unnamed individual. The offenses under investigation include “complicity” in illegal transactions and in the possession and distribution of child sexual abuse material, the prosecutor said in a statement.

Telegram said in a statement posted on X that it complies with European Union law. It said Durov has “nothing to hide” and that it is “absurd to claim that a platform or its owner is responsible for the abuse of that platform.”

Telegram has long branded itself as relatively unmoderated and unwilling to cooperate with law enforcement. Durov said in April that it had 900 million regular users.

John Shehan, senior vice president of NCMEC’s Exploited Children Division & International Engagement, said he was encouraged by France’s decision to arrest Durov because Telegram has been such a haven for child sexual abuse.

“Telegram is really in a league of its own when it comes to the lack of content moderation or even interest in preventing child sexual exploitation on their platform,” he said.

“It is encouraging to see that the French government and the French police are taking measures to potentially tackle this kind of activity,” Shehan said.

Telegram’s website states that it never responds to reports of any kind of illegal activity in private or group chats, “even if reported by a user.” It also states that unlike other major tech platforms, which routinely comply with court orders and warrants for user data, “we have released 0 bytes of user data to third parties, including governments.”

NBC News reached out to Telegram for comment on the groups’ claims that their efforts to flag CSAM have been ignored. In a statement, Telegram spokesperson Remi Vaughan did not address their comments but said the platform “actively moderates harmful content on its platform, including child abuse material.”

“Moderators use a combination of proactive monitoring of public areas of the platform, AI tools, and user reporting to remove content that violates Telegram’s Terms of Service,” Vaughan said. Telegram maintains a channel that provides daily updates on how many groups and channels have been reported for child abuse, and claims that thousands of public groups are banned daily.

In a report last year on platforms’ enforcement against child sexual exploitation, the Stanford Internet Observatory noted that while Telegram’s rules prohibit sharing child sexual abuse content in public channels, it is the only major tech platform whose privacy policy does not explicitly prohibit child sexual abuse or the luring of children in private chats.

Under U.S. law, platforms must work with NCMEC, which operates the world’s largest international coordination center connecting law enforcement, social media platforms and tipsters, to flag confirmed abusive material for swift removal. Telegram is based in Dubai, United Arab Emirates, which Durov, who was born in the Soviet Union, has described as a neutral country that keeps the platform independent of any government.

But major tech companies outside the U.S., including TikTok, which is owned by Chinese company ByteDance, Britain’s Fenix, which owns OnlyFans, and Canadian conglomerate Aylo, which owns Pornhub, all remove child sexual abuse content flagged by the NCMEC, Shehan said.

Telegram offers what it describes as an option to end-to-end encrypt private messages, meaning that only users, not the platform, can read them. But while other end-to-end encrypted messaging services, such as WhatsApp, allow users to report and forward illegal content, Telegram has no such option.

NCMEC has received a total of 570,000 reports of CSAM on Telegram since the app launched in 2013, Shehan said.

“They’ve made it really clear to the team that they’re not interested. We’ve been reaching out sporadically, but it’s not happening as often anymore,” he said. “They’re not responding at all.”

A spokesperson for the U.K.-based Internet Watch Foundation, an independent nonprofit organization working to stop the spread of CSAM, said the group has made repeated attempts to work with Telegram over the past year, but that Telegram has refused to adopt any of its services to “block, prevent and disrupt the sharing of child sexual abuse images.”

“There is no excuse,” said the group’s deputy CEO, Heidi Kempster. “All platforms have it in their power to do something now to stop the spread of child sexual abuse images. We have the tools, we have the data, and any failure to stop this well-known content from spreading is an active and deliberate choice.”

Stephen Sauer, director of the Canadian national CSAM tipline at the Canadian Centre for Child Protection, said in an emailed statement that Telegram is not only ignoring attempts to report CSAM but that abusive material is increasingly appearing there.

“Based on our observations, Telegram’s platform is increasingly being used to make CSAM available to offenders. In many cases, we see Telegram links or accounts advertised on web forums and even on US mainstream social media platforms that act as a funnel to drive traffic to illegal Telegram-based content,” he said.

“Telegram’s moderation practices are completely opaque — we have absolutely no idea how they work. We also receive no confirmation or feedback from the company about the outcome of moderation when we report content. More importantly, it appears that the platform itself is not taking sufficiently proactive steps to curb the spread of CSAM on its service, despite its known use to facilitate the exchange of such material,” Sauer said.

This story first appeared on NBCNews.com.
