Experts: European Union's 'chat control' plan a blow to free speech

The EU is advancing a plan to require messaging apps to scan all user messages for child sexual abuse material — a move experts warn undermines fundamental rights.
The regulation, led by the Council of the European Union and commonly known as "chat control," rests on two pillars: the preemptive scanning of content and the verification of users' ages. This would require messaging platforms to analyze content directly on a user's device before it is encrypted and sent. According to Oliver Laas, a philosopher at Tallinn University, this alone would render end-to-end encryption essentially meaningless.
"It's roughly the equivalent of a police officer reading your letter before you put it in an envelope and send it. That's no longer the kind of communication where only the sender and recipient can access the message, which is the whole point of end-to-end encryption," Laas explained.
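The point can be made concrete in a few lines of code. The sketch below is a toy model, not real cryptography: a hash-based keystream stands in for proper end-to-end encryption, and a trivial substring check stands in for the proposed on-device classifier. What it shows is purely structural: because the scan runs before encryption, the scanning step sees the plaintext, regardless of how strong the cipher afterwards is.

```python
import hashlib

# Toy keystream cipher (illustrative only -- NOT real cryptography).
def keystream(secret: bytes, length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(secret: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(secret, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

def scan(plaintext: bytes) -> bool:
    # Stand-in for the proposed on-device content classifier.
    return b"flagged-term" in plaintext

def send_with_client_side_scanning(secret: bytes, plaintext: bytes):
    # The scan runs on the sender's device BEFORE encryption, so the
    # scanning party sees the plaintext -- the "letter read before it
    # goes in the envelope" from the quote above.
    report = scan(plaintext)
    ciphertext = encrypt(secret, plaintext)
    return ciphertext, report

secret = b"shared-secret"
ciphertext, flagged = send_with_client_side_scanning(secret, b"hello")
assert decrypt(secret, ciphertext) == b"hello"  # key holders can still read it
assert flagged is False                         # but the scanner already looked
```

However robust `encrypt` is made, the confidentiality guarantee of end-to-end encryption no longer holds once `scan` runs on the plaintext.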
An equally significant change would be the mandatory age verification requirement, which Laas says would effectively put an end to anonymous communication online. This combination, he warns, would not only violate privacy but also establish an unprecedented surveillance infrastructure that could easily be abused in the future.
Identification as the end of anonymity
The draft regulation calls for age verification on platforms deemed to pose potential risks to children and minors — that is, technically, on all messaging apps. While the European Union would likely implement a system similar to its digital wallet, meaning users wouldn't need to send personal data directly to the messaging platforms, Oliver Laas believes this would still significantly undermine user anonymity.
"Essentially, the third party verifying your identity — whether it's the police, the state or another authority — would be able to determine who is behind a pseudonym or anonymous username. That means anonymity on the internet would no longer exist," Laas warned.
The loss of anonymity would, in turn, deal a serious blow to freedom of expression. According to the philosopher, it would particularly constrain those expressing unpopular or government-critical views. For example, it would complicate the work of whistleblowers and investigative journalists communicating with their sources.
Paradoxically, age verification could end up harming those it is meant to protect. Toomas Krips, a lecturer in cryptography at the University of Tartu, pointed to the United Kingdom as a clear example. In July, the UK began requiring websites with adult content to verify the age of visitors.
"Statistically, the person most likely to abuse a child is a member of their own family. But if we place age verification barriers on internet traffic, we already have cases where children can no longer access pages and resources that could help them in difficult situations — like the Reddit community r/sexualassault for survivors of sexual violence," Krips noted.
Breaking encryption and the threat of cyberattacks
The purpose of end-to-end encryption is to ensure that only the sender and recipient can read the messages being exchanged. The new regulation would force service providers to build so-called backdoors into their apps, allowing content to be scanned before encryption. According to experts, such a system would create a new and highly attractive target for cybercriminals.
"If such a system were added, it would be a very tempting target for all kinds of bad actors who could then gain much easier access to all communications," said Toomas Krips. Oliver Laas added that, in effect, this would force software developers to turn people's own devices against them by making them spy on their users.
He pointed to a cautionary example from the United States, where similar legally mandated backdoors have already been built into telecom networks. "It was recently revealed that a Chinese hacker group took over the very backdoors intended for the FBI and other agencies and used them to eavesdrop on network traffic for months," Laas noted.
Both experts agree that the idea of granting access to encrypted messages only to the "good guys" — while somehow keeping "bad guys" out — is laughably naive.
False positives and ruined lives
Machine-learning-based scanning systems used for this type of content detection are notorious for their inaccuracy. Toomas Krips cited data from Swiss police, where a similar system is already in use. "Of all the reports they receive, 80 percent are completely irrelevant from a criminal standpoint — in other words, 80 percent are false positives," the cryptographer noted.
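The arithmetic behind such numbers is the classic base-rate problem: when the content being searched for is extremely rare, even a seemingly accurate classifier produces mostly false alerts. The figures below are hypothetical (only the 80 percent figure above comes from the article) and are chosen only to show the shape of the effect.

```python
# Base-rate illustration -- all inputs are hypothetical assumptions.
daily_messages = 100_000_000  # assumed scanning volume per day
prevalence     = 1e-6         # assumed fraction of messages that are truly illegal
tpr            = 0.90         # assumed true-positive rate of the classifier
fpr            = 0.001        # assumed false-positive rate (0.1%)

true_hits  = daily_messages * prevalence * tpr        # genuine detections
false_hits = daily_messages * (1 - prevalence) * fpr  # innocent messages flagged
precision  = true_hits / (true_hits + false_hits)     # share of alerts that are real

print(f"{false_hits:,.0f} false alerts vs {true_hits:.0f} real ones "
      f"(precision {precision:.2%})")
```

With these inputs, false alerts outnumber genuine detections by roughly a thousand to one; a classifier would need an implausibly low false-positive rate before most reports stopped being about innocent people.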
Such misclassifications aren't just harmless statistics; they can seriously disrupt people's lives. Oliver Laas cited a case in which a father sent a photo of his child's skin rash to a doctor — only for Google's automated system to wrongly classify it as child pornography. Although police later dropped the investigation, the man lost access to his Google accounts for an entire year.
A large portion of false positives also come from teenagers sharing nude photos with one another. Under automatic scanning, such images would be flagged and forwarded to third parties for review. In the worst cases, this can lead to baseless criminal investigations against the very minors the system is meant to protect.
"We don't actually have reliable statistics or research showing that mass surveillance of this kind even works. What we do have — from mass telecommunications data collection and corporate practices — is evidence showing no positive correlation between these efforts and actual crime-solving rates," Laas added.
Estonia and other states firmly opposed
Due to the risks outlined above, the draft regulation has also faced strong political opposition. According to Henrik Trasberg, a legal policy adviser at the Ministry of Justice, it is unlikely the proposal will move forward in its current form.
Estonia formed its position on the matter three years ago. Trasberg explained that while Estonia supports the overall goal of preventing child sexual abuse and agrees with certain measures — such as mandatory age verification — breaking encryption is a red line.
"What we have not supported is any part of the regulation that would undermine end-to-end encryption or require the implementation of backdoors or other methods for analyzing encrypted content. We've been opposed to that from the beginning," Trasberg said.
He added that several other member states share Estonia's position, including Germany, the Netherlands and Poland. The European Parliament also does not support the proposal. "This issue has been around for three years now and that's precisely why the initiative hasn't moved forward. The opposition from member states has been strong enough to stall it," Trasberg explained.
Slippery slope to authoritarianism
Both Oliver Laas and Toomas Krips warn that the surveillance infrastructure being built under the guise of child protection could easily be expanded. "If, for example, a more authoritarian regime were to come to power, it would be very easy for them to use this structure to track down politically undesirable individuals," Krips said.
Laas agreed, pointing to the broader global decline of democracy. He cautioned that in the future, the definition of "extremist content" could be expanded to include, for example, the political views of current ruling parties. "If someone with Orban-like instincts — but more competent — were to come to power in Estonia in the next decade, such a surveillance system would make it very easy to sideline opponents," he added.
Laas also rejected the commonly used argument that "honest people have nothing to hide." "If you've got nothing to hide, my email is public online. Please send me the passwords to all your accounts. Let me take a look at what you've got. I'm quite certain that none of these so-called 'honest' people will actually send me their passwords," the philosopher said.
According to Laas, this line of thinking reflects a broader misconception about privacy: "Thanks to lobbying by security agencies and governments, we've come to associate privacy with secrecy, shame or having something to hide. In reality, privacy is about the individual's right to control what information they share and with whom."
Therefore, the belief that someone has "nothing to hide" should not mean giving up the right to decide what personal information is disclosed and to whom. Laas also argued that neither the state nor international organizations should know more about citizens than those citizens have voluntarily chosen to share. "It's especially telling that under the proposed surveillance rules, security services would be exempt from chat control. That shows they fully understand what they're trying to impose on the rest of the population — they just don't want to be subjected to it themselves," he added.
As an alternative to mass surveillance, 48 child protection and human rights organizations have issued a joint statement recommending that law enforcement agencies stick to targeted surveillance — only with court approval and applied to specific suspects.
According to these organizations, far more effective child protection measures would include developing platforms and applications that prioritize user privacy — such as through end-to-end encryption — while also strengthening child protection and support services, improving media and digital literacy among minors and expanding access to sex education. These efforts would help children identify signs of abuse early and ensure safe, accessible reporting channels.
--
Editor: Marcus Turovski