EU to vote today on controversial attempt to destroy private messages

The European Union is voting today (September 23) on controversial chat control legislation in a move that security and privacy experts say will destroy private messaging in the EU.

The EU is engaged in a concerted effort to undermine privacy and security by attempting to pass legislation that would force companies to break end-to-end encryption (E2EE). The bloc has proposed the use of “client-side scanning,” a technology that scans files on users’ devices and alerts authorities if anything illegal is detected.



After previous attempts were shot down, the EU has rebranded “client-side scanning” as “upload moderation,” essentially an attempt to force users to agree to client-side scanning if they want to be able to send or upload media files over a messaging platform that otherwise supports E2EE. “Upload moderation” is a clever way to make E2EE essentially irrelevant, while still technically being able to tout support for strong encryption.

Meredith Whittaker, president of Signal, pushed back against the proposal and criticized the EU for trying to outsmart users. She pointed to the mathematical reality that there is no way to maintain secure and private communications while attempting to undermine or circumvent E2EE.

Instead of accepting this fundamental mathematical reality, some European countries continue to play rhetorical games. They have come back to the table with the same idea under a new label. Instead of using the previous term “client-side scanning,” they have given it a new name and now call it “upload moderation.” Some claim that “upload moderation” does not undermine encryption, because it happens before your message or video is encrypted. This is not true.

Rhetorical games are all well and good in marketing or tabloid writing, but they’re dangerous and naive when applied to such a serious, high-stakes topic. So let’s be crystal clear once again: mandating mass scanning of private communications fundamentally undermines encryption. Period. It does not matter whether this is done by, say, tampering with the random number generation of an encryption algorithm, by implementing a key escrow system, or by forcing communications to pass through a surveillance system before they’re encrypted. We can call it a backdoor, a front door, or “upload moderation.” But whatever we call it, each of these approaches creates a vulnerability that can be exploited by hackers and hostile nation-states, removing the protection of unbreakable mathematics and replacing it with a high-value vulnerability.
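To see why scanning “before encryption” is not a loophole that leaves encryption intact, consider a minimal Python sketch of the pattern being described. Everything here is illustrative: the exact-hash blocklist stands in for the opaque perceptual-hash databases real proposals envision, the XOR cipher stands in for a real E2EE protocol, and all names (`BLOCKLIST`, `send_message`, the `reports` list) are hypothetical.

```python
import hashlib

# Toy blocklist holding the digest of one "flagged" sample message.
# Real deployments would use third-party hash databases the provider
# itself cannot inspect.
FLAGGED_SAMPLE = b"flagged sample"
BLOCKLIST = {hashlib.sha256(FLAGGED_SAMPLE).hexdigest()}

reports = []  # stands in for the reporting channel to authorities


def client_side_scan(plaintext: bytes) -> bool:
    """Match the message against the blocklist, in the clear."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST


def send_message(plaintext: bytes, key: bytes) -> bytes:
    # The scan runs on the device *before* any encryption happens,
    # so the content is inspected unprotected. Whatever happens to
    # the ciphertext afterwards, confidentiality was already lost here.
    if client_side_scan(plaintext):
        reports.append(plaintext)  # content exits the E2EE boundary
    # Toy XOR "encryption" in place of a real protocol.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))
```

The point of the sketch: the encryption code is untouched, yet a copy of the flagged plaintext leaves the device through a side channel the user never sees, which is exactly why “it happens before encryption, so encryption is fine” is beside the point.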

We ask that those playing these word games stop and acknowledge what the expert community has repeatedly made clear: either end-to-end encryption protects everyone, enshrining security and privacy, or it is broken for everyone. And breaking end-to-end encryption, especially at such a geopolitically volatile time, is a disastrous proposition.

Patrick Breyer, former MEP for the Pirate Party and co-negotiator of the European Parliament’s critical stance on the proposal, says the EU will vote on the revised measure today. He then outlines the problems such a measure will cause if adopted.

“Rather than empowering teenagers to protect themselves from sextortion and exploitation by making chat services safer, victims of abuse are being betrayed by an unrealistic bill that, according to the EU Council’s own legal assessment, is doomed to fail in court,” Breyer wrote. “Inundating our police with largely irrelevant tips about old, long-known material will not save victims from ongoing abuse and will actually weaken law enforcement’s ability to tackle predators. Europeans need to understand that if this bill is implemented, they will lose the use of everyday secure messengers – and that means losing touch with their friends and colleagues around the world. Do you really want Europe to become the world leader in tapping our smartphones and mandating indiscriminate blanket surveillance of the chats of millions of law-abiding Europeans?”

“Regardless of the purpose, imagine if the Postal Service simply opened and scanned every letter without suspicion,” Breyer added. “That’s unthinkable. Moreover, it is precisely the current mass screening for so-called known content by Big Tech that is exposing thousands of perfectly legal private chats, overburdening law enforcement, and criminalizing minors en masse.”

The EU acknowledges that the measure infringes on privacy

Interestingly, the EU does not even try to hide that the measures it proposes are among the most privacy-invasive solutions available.

The EU itself described the approach in 2022:

At the same time, the detection process would be the most intrusive for users (compared to the detection of known and new CSAM), because text search, also in interpersonal communication, would be the most important vector for detecting grooming.

Even more telling is the fact that EU ministers want to ensure that they are exempt from the chat monitoring legislation. This is the most damning evidence yet that the EU is aware of the implications of its efforts for privacy.

“The fact that EU interior ministers want to exempt police officers, soldiers, intelligence officers and even themselves from chat surveillance scanning shows that they know exactly how unreliable and dangerous the sniffing algorithms are that they want to unleash on us citizens,” Breyer said. “They seem to fear that even military secrets with no link to child sexual abuse could end up in the US at any time. The confidentiality of government communications is certainly important, but so must the protection of corporate communications and of course citizens’ communications, including the spaces that victims of abuse themselves need for safe exchanges and therapy. We know that most of the chats leaked by the current voluntary sniffing algorithms are not relevant to the police, for example family photos or consensual sexting. It is outrageous that EU interior ministers do not want to suffer the consequences of the destruction of digital privacy of correspondence and secure encryption that they are imposing on us.”

Why is the EU pushing for chat controls?

Given the concerns surrounding chat monitoring, many may wonder why the EU is so determined to pass such legislation, especially since the EU positions itself as a proponent of privacy.

In short, chat control is promoted as a way to combat child sexual abuse material (CSAM). Unfortunately, while such a goal is admirable, attempting to address it with chat control legislation is problematic at best.

“Let me be clear about what that means: ‘detecting grooming’ is not simply looking for known CSAM. It’s not using AI to detect new CSAM, which is also on the table. It’s running algorithms that read your actual text messages to figure out what you’re saying, at scale.” — Matthew Green (@matthew_d_green), May 10, 2022.

“It’s going to potentially do this to encrypted messages that are supposed to be private. It’s not going to be good, and it’s not going to be smart, and it’s going to make mistakes. But what’s scary is that once you open up ‘machines that read your text messages’ for whatever purpose, there are no boundaries.” — Matthew Green (@matthew_d_green), May 10, 2022.

The private messaging platform Threema describes the issues further:

Of course, sharing CSAM is an absolutely unacceptable, heinous crime that must be punished. However, before CSAM can be shared online, a child must have suffered abuse in real life, which is what effective child protection should try to prevent (and is not the focus of Chat Control). For this and many other reasons, child protection organizations such as the German Federal Child Protection Association oppose Chat Control, claiming that it is “neither proportionate nor effective.”

Furthermore, there is no way to really know whether Chat Control would actually be (or remain) limited to CSAM. Once the mass surveillance apparatus is in place, it can easily be expanded to detect content other than CSAM without anyone noticing. From a service provider’s perspective, the detection mechanism, which is created and maintained by third parties, essentially acts as a black box.

Experts say there are better options

In Germany’s arguments against the EU’s efforts, Chief Prosecutor Markus Hartmann, head of the Central and Contact Point for Cybercrime North Rhine-Westphalia, said the EU went too far with its proposals. Instead, he said law enforcement agencies should be better funded and supported so they can better combat CSAM using traditional techniques. Other experts agree with Chief Prosecutor Hartmann.

“Child protection is not served if the legislation later fails before the European Court of Justice,” said Felix Reda of the Society for Civil Rights. “The damage to the privacy of all people would be immense,” he added. “Suspicionless surveillance violates the very essence of the right to privacy and therefore cannot be justified by any fundamental rights assessment.”

“The draft regulation fundamentally misses the point of combating depictions of child abuse,” stressed computer scientist and Chaos Computer Club spokeswoman Elina Eickstädt (via computer translation). “The draft is based on a gross overestimation of the capabilities of technologies,” particularly with regard to the detection of unknown material.

What happens if the legislation passes?

If the EU succeeds in passing the legislation, citizens will lose access to private communications platforms such as Signal and Threema, as both have indicated they would withdraw from the EU rather than comply.

The issue is expected to come before the EU courts in due course, and experts hope the legislation will be rejected there.

In the meantime, EU citizens will have to make do with what Matthew Green calls “the most sophisticated mass surveillance equipment ever deployed outside China and the USSR.”
