Chat Control: what is at stake for encrypted communication in Europe

The EU’s proposed rules to combat child sexual abuse online have triggered one of the most important privacy debates in Europe.

No one disputes the goal. Preventing child abuse and stopping the spread of illegal material is a legitimate and necessary objective. The real question is how this should be done—and whether the chosen method could undermine private communication for everyone else.

For providers of encrypted services, the concern has always been the same: if platforms are pushed to scan private communications, especially before encryption is applied, the practical protection of end-to-end encryption is weakened.

The core issue

The debate is not about whether abuse should be fought. It is about whether broad detection and scanning duties can coexist with secure private communication.

Critics of the proposal have warned that measures resembling client-side scanning would move inspection onto the user’s device, before encryption can fully protect the content. Supporters argue that stronger tools are needed to detect abuse online. That tension is at the heart of the debate.

Why this matters

For private messaging and collaboration tools, trust depends on a simple expectation: the provider should not have routine access to the substance of your communication.

If regulation pushes service providers toward scanning obligations that reach into private messages, photos, or shared documents, that changes the security model. It affects not only consumer chat apps, but also privacy-first services used for legal work, advisory work, internal strategy, and other sensitive collaboration.

The risk for encryption

End-to-end encryption is valuable because it limits who can read the content of a communication. If content must be checked before that protection applies, privacy advocates and many security experts argue that the result is not neutral regulation, but pressure on the encryption model itself.
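The ordering matters here. As a toy sketch (not real cryptography, and not any specific proposal): client-side scanning runs on the sender's device while the content is still readable, before encryption is applied, so the server only ever sees ciphertext yet the content has already been inspected. The hash blocklist and the XOR "cipher" below are illustrative stand-ins, not a real detection system or encryption scheme.

```python
# Toy illustration of WHERE client-side scanning sits in the pipeline.
# Not real cryptography: the blocklist and XOR "cipher" are stand-ins.
import hashlib

BLOCKLIST = {hashlib.sha256(b"known-illegal-sample").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the sender's device, on the *unencrypted* content."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Placeholder for real end-to-end encryption; a repeating-key XOR
    is used purely to show the ordering of the two steps."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send(plaintext: bytes, key: bytes):
    # The scan happens while the content is still readable:
    if client_side_scan(plaintext):
        return ("reported", None)   # content was inspected pre-encryption
    # Only after the scan does encryption protect the content in transit:
    return ("sent", encrypt(plaintext, key))
```

The point of the sketch is structural: even though the transport stays encrypted, the detection step sees the plaintext, which is why critics argue the practical guarantee of end-to-end encryption is weakened.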

Even where lawmakers try to narrow the scope, the line matters: rules aimed at detecting abuse should not quietly become rules that normalize broad monitoring of private communications.

Why Qaxa is paying attention

Qaxa is built on the assumption that private work should remain private. That includes messages, files, notes, and sensitive professional coordination.

We follow this debate because any regulatory framework affecting encrypted communication in Europe matters to the people we serve: lawyers, advisors, founders, operators, and teams handling work that should not be exposed by default.

Privacy is not an obstacle to safety. But regulation should be careful not to weaken the basic infrastructure of trusted communication in the process.

Update (January 2026): Since this article was first published on 19 September 2025, the legislative process has moved forward. On 26 November 2025, EU member states agreed the Council’s negotiating position on the proposed regulation. The Council text emphasizes risk assessment and mitigation obligations for providers, would extend the existing voluntary scanning regime beyond its scheduled expiry, and would establish an EU Centre on Child Sexual Abuse. The European Parliament’s position, adopted in November 2023, called for effective measures without mass surveillance and said end-to-end encrypted material should be excluded from detection. The file is now in trilogue negotiations between Parliament, Council, and Commission. Separately, on 19 December 2025, the Commission proposed extending the Interim Regulation so providers could continue voluntary detection and reporting beyond 3 April 2026 while negotiations continue.
