FALSE: The EU will require the installation of state software on all smartphones to control citizens

21 November 2025

Verification within Meta’s Third-Party Fact-Checking Program

A video is circulating online in which a blogger claims that the European Parliament is adopting a law that will require the installation of special state software on all smartphones in the EU. According to him, this software will be used to monitor citizens’ correspondence: it will scan every message and image on the phone, transmit the data to messengers and providers, and, in case of violations, to the police. The blogger presents this as evidence of the introduction of an “iron curtain” in the EU.

However, this is false. The law does not provide for the installation of state software on the devices of EU citizens. It is aimed at preventing sexual violence against children, not at total surveillance of users.

Screenshot of the post 

In the European Union, discussions are indeed ongoing regarding a regulation titled Regulation laying down rules to prevent and combat child sexual abuse (CSAR), often referred to in public debate as Chat Control. It is not yet an adopted law but a draft that the European Commission presented in 2022 and that the European Parliament and the Member States have been discussing ever since. The document is currently under revision: parliamentarians, governments, and technology companies are trying to find a compromise between protecting children online and respecting users’ right to privacy.

When the law will be voted on is still unknown: the EU Council vote planned for October 14, 2025, was postponed due to a lack of consensus among Member States.

The essence of the initiative is to impose a legal obligation on internet platforms and messengers to detect and report child sexual abuse material. One technical approach that was considered at an early stage is so-called client-side scanning, that is, checking files and messages before they are encrypted.

Client-side scanning could, in principle, allow illegal content to be detected without decrypting messages after they are sent. With end-to-end encryption, the keys exist only on the devices of the sender and the recipient, so servers have little ability to check whether a message contains illegal content. Client-side scanning bypasses this technical obstacle: the check takes place directly on the device before encryption, and the server receives only a signal about potentially illegal content, while the messages themselves remain encrypted.
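To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of hash-based client-side scanning. Everything in it is an assumption for illustration: the function names, the hash list, and the use of SHA-256 (real detection systems such as PhotoDNA use perceptual hashes that tolerate small image changes, not cryptographic hashes). It is not part of any actual CSAR proposal or messenger implementation.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal files.
# Illustration only: real systems use perceptual hashes, and the
# list would be distributed by a designated authority, not hardcoded.
KNOWN_HASHES = {
    # SHA-256 of the stand-in payload b"known-bad-file"
    hashlib.sha256(b"known-bad-file").hexdigest(),
}

def client_side_scan(attachment: bytes) -> bool:
    """Return True if the attachment matches a known hash.

    Runs on the sender's device *before* encryption, so the server
    never sees the plaintext, only a match/no-match signal.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES

def send_message(attachment: bytes, encrypt) -> dict:
    # The flag travels alongside the ciphertext; the content itself
    # stays end-to-end encrypted and unreadable to the server.
    return {
        "ciphertext": encrypt(attachment),
        "flagged": client_side_scan(attachment),
    }
```

The point of the sketch is the ordering: the hash check happens before `encrypt` is applied, which is why critics argue that even this design creates an inspection point on the user’s device despite the encryption remaining intact.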

However, this very part raised the greatest concern among experts and human rights defenders, as it could create a precedent for interference in private correspondence. Some European governments and civil society organizations insist that even the fight against pedophilia cannot justify mass monitoring of private messages. Human rights organizations, including European Digital Rights (EDRi), warn that scanning technology could weaken encryption and create risks of abuse.

Considering this, the current versions of the regulation require that any intervention be carried out only on the basis of a court decision or an official warrant. The draft law also no longer includes the mandatory client-side scanning envisaged in the initial proposals. Instead, voluntary detection of materials by platforms continues, but the law formalizes the legal framework: it establishes platforms’ responsibility if they fail to respond to detected content, harmonizes rules across the EU, and creates common standards so that platforms act consistently and transparently.

The current EU rules, which will apply at least until April 3, 2026, allow platforms to voluntarily use existing technologies to detect child sexual abuse material — and only those that do not interfere with end-to-end encryption. For example, these may include algorithms for scanning metadata, file hashes, or signals of potentially illegal content without decrypting messages.

Despite critics’ concerns, the regulation has never provided for the installation of state software on citizens’ phones. It describes a narrowly targeted mechanism to counter child sexual abuse, not an instrument of mass surveillance.

The blogger whose video is being circulated on social media distorted these facts. He also compared the EU with Russia, portraying Russia as supposedly more democratic. In reality, the opposite is true: in the EU, the discussion of Chat Control is open, involving human rights defenders, lawyers, technology companies, and civil society, and decisions are made publicly and accountably.

By contrast, the Russian Federation has long operated an extensive infrastructure of state internet control, notably the SORM system (“System for Operative Investigative Measures”). It consists of three levels: SORM-1 intercepts telephone calls, SORM-2 intercepts internet traffic, and SORM-3 covers all types of communications, including Wi-Fi and social networks. By law, all internet providers must install “black boxes” on their equipment that give the FSB and other security services direct access to traffic, and the services are not obliged to notify the provider of the interception or to present a warrant.

Thus, in the EU there is an ongoing public process of discussing the CSAR regulation with the involvement of human rights defenders and technology companies to protect children from sexual abuse online. At the same time, in Russia, there already exists an operational system of mass state control of the internet.

Attention

The authors do not work for, consult for, own shares in, or receive funding from any company or organization that would benefit from this article, and have no relevant affiliations.