Speed limit, veggie day, gender star – in many places, we Germans see our freedom, for which we fight so passionately, under threat. All too often this ends in shallow liberal buzzword debates. But at a point where our civil liberties are threatened more massively than almost ever before, it is remarkably quiet: with the so-called chat control, the European Commission has presented a draft law that shakes our freedom of communication, our digital secrecy of correspondence and our privacy to the core.

If the draft is passed, the EU Commission would have communication on the Internet comprehensively monitored: every message and all other content could be scanned. The Commission's stated aim is to find depictions of the abuse of children and young people. With the intention of preventing child abuse, the EU Commission is creating the most blatant case of indiscriminate mass surveillance we have seen since the NSA affair.

Only this time it is not foreign intelligence services spying on us – it is the European Union. And even child protection organizations describe the chat control proposal as disproportionate because of its comprehensive scanning of private communication.

Of course, the Commission draft makes literally no mention of mass surveillance. Instead, the draft "Regulation to prevent and combat child sexual abuse" prefers to speak of ways to identify and remove certain content in digital services.

This is not just about chats conducted via messengers or direct messages, encrypted or unencrypted. It also covers e-mails, chat messages in online games, and content at app stores and hosting providers of all kinds, such as files in the cloud.

All of these providers would be obliged to automatically detect known or new depictions of child abuse by constantly scanning messages. Text is also to be screened and evaluated in order to detect grooming – adults contacting minors with the aim of initiating sexual abuse.
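How such detection typically works for known material can be sketched in a few lines: providers compare a fingerprint (hash) of each file against a database of hashes of previously identified abuse imagery. The sketch below uses a cryptographic SHA-256 hash purely for illustration – real deployments use perceptual hashes such as Microsoft's PhotoDNA, which also match re-encoded or resized copies – and the database contents here are of course hypothetical placeholders:

```python
import hashlib

# Conceptual sketch only: a hypothetical database of hashes of known
# illegal material. A cryptographic hash only matches byte-identical
# files; real systems use perceptual hashes to survive re-encoding.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def is_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-material list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

print(is_known_material(b"known-bad-file-bytes"))   # True
print(is_known_material(b"family beach photo"))     # False
```

Note what this cannot do: it only recognizes material that is already in the database. Detecting *new* depictions or grooming in text, as the draft demands, requires statistical classifiers – and with them the error rates discussed below.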

However, it is completely unclear how this is supposed to work technically and in detail. How is a scanner to distinguish a beach photo of a child from the summer vacation, posted to the family WhatsApp group, from child pornography? The line between holiday photos of children running around naked on the beach and abusive material is often a fine one. And with text messages: is it consensual sexting between two 16-year-olds, or is it grooming? These fundamental questions remain unresolved.

Without a far-reaching obligation to identify all communication participants and precise knowledge of the context in which an image is sent, even the best filter or scanner is often wrong. The industry standard cited by the EU Commission itself assumes an error rate of twelve percent for text recognition. With billions of messages sent every day, that quickly adds up.
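A back-of-the-envelope calculation shows how quickly. The twelve-percent error rate is the figure cited above; the assumption that one billion text messages per day would be scanned is purely an illustrative round number:

```python
# Illustrative arithmetic only: the 12% error rate is the figure cited
# in the text; the daily scanning volume is an assumed round number.
error_rate = 0.12                  # error rate for text recognition (cited above)
scanned_per_day = 1_000_000_000    # assumption: one billion scanned texts per day

misclassified_per_day = scanned_per_day * error_rate
print(f"Misclassified messages per day: {misclassified_per_day:,.0f}")
```

Even under this assumption, the scanner would produce 120 million wrong classifications every single day – each one a potential false report of suspicion for investigators to sift through.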

Debates at the intersection of freedom and security rarely take place on an objective basis. The question of how much freedom and how much security we want or can guarantee will never be resolved entirely to the satisfaction of both sides. But with this bill, the proponents of security logic have overreached.

All sides affirm that encrypted communication can remain encrypted. But private communication is private, whether encrypted or not. Digital civil rights are not second-class civil rights – and neither is the digital secrecy of correspondence. After all, the state does not steam open letters or wiretap telephone calls en masse.

Network activists are accused of caring only about their own privacy, not about the protection of children. That is unfair, not least because the three usual security-policy narratives cut both ways. The first narrative: the Internet is not a lawless space. That means criminal law applies there, of course, but so do civil liberties. The second: what applies in the analogue world must also apply in the digital world. The secrecy of correspondence covers postal mail and the Internet alike.

Third: security authorities must be able to act on an equal footing with criminals. So instead of wasting resources chasing down false reports of suspicion and getting bogged down in data junk, security agencies should have accurate and efficient digital tools at their disposal.

The reports of recent weeks that more and more depictions of abuse are being found on the European part of the Internet leave no one indifferent. Internet activists have therefore demanded for years that depictions of abuse, once identified as such, be deleted instead of merely being blocked behind a stop sign on websites. Deleting images is the only way to stop their spread once they are online.

Despite all the fascination with technology and the hope that it will solve social problems, classic investigative work must not be forgotten. A prominent example is the widespread abuse complex that the special police investigation unit "Berg" has been working on since October 2019. In the house of a man from Bergisch Gladbach, investigators found vast amounts of child-pornographic material. Through him, they came across other perpetrators who exchanged videos and images of severe sexual abuse of children online.

These successes show how capable the investigative authorities are even without new powers and reporting obligations. Technology should be used across the board to support and relieve them, for example in the psychologically extremely demanding evaluation work. AI-supported tools for analysing image material or mapping perpetrator networks are conceivable.

The Commission draft is silent on all of this. Instead, it calls private communication comprehensively into question – and that is the core of our freedom debate. Because our freedom rights really are in danger.

Ann Cathrin Riedel is chairwoman and Teresa Widlok is deputy chairwoman of LOAD e.V., the association for liberal internet policy.