The European Parliament has rejected a proposal to extend a special provision that allowed digital platforms to detect illegal content within users’ private communications. The measure was a temporary exception to the ePrivacy framework and expired in early April 2026.

In the vote, a majority of lawmakers opposed the extension, with more than 300 members voting against and significantly fewer in favour.

Tech companies argue for continued safeguards

Major technology companies, including Google, Meta, Microsoft, and Snapchat, opposed the decision. They emphasized the importance of maintaining automated tools designed to detect illegal material, particularly child sexual abuse material (CSAM).

According to industry representatives, these systems do not involve reading messages in the conventional sense. Instead, they rely on hash matching, which converts files into digital fingerprints and compares them against databases of known illegal material. Companies argue that such methods are both effective and essential for supporting law enforcement efforts.
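The mechanism the companies describe can be sketched in a few lines. Note this is a simplified illustration, not any platform's actual implementation: production systems typically use perceptual hashes (such as Microsoft's PhotoDNA) that tolerate resizing and re-encoding, whereas this sketch uses exact SHA-256 cryptographic hashes, and the hash database shown is hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal files.
# (Here it contains only the SHA-256 digest of the bytes b"test"
# so the example is self-contained and verifiable.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Convert a file's raw bytes into a fixed-length digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Check the fingerprint against the database of known content.

    The file itself is never 'read' for meaning; only its digest
    is compared, which is the core of the industry's argument.
    """
    return fingerprint(data) in KNOWN_HASHES

print(matches_known_content(b"test"))          # True: digest is in the database
print(matches_known_content(b"other bytes"))   # False: unknown content
```

Because only digests are compared, a match reveals nothing about files that are not already in the database, which is why proponents distinguish this from general message reading.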

Lawmakers prioritize privacy protections

Most members of Parliament, however, took a different stance. They stressed that large-scale analysis of private communications poses significant risks to fundamental rights.

In their view, such practices undermine trust in digital services and threaten the confidentiality of personal communications. The debate ultimately centered on balancing security needs with the right to privacy.

Negotiations reach a deadlock

Efforts to establish a long-term regulatory framework between the European Parliament and the Council of the EU failed to produce an agreement. The European Commission had proposed extending the transitional period to allow further discussions.

Lawmakers, however, called for stricter limitations and shorter timelines. As a result, no compromise was reached, and the legal basis for voluntary message scanning has lapsed.

What comes next

Despite the absence of formal authorization, technology companies have stated they will continue implementing their own safety measures across their platforms.

The decision signals a clear shift in priorities: the protection of personal data and communication privacy is taking precedence, even in the context of combating illegal content.
