EU member states have agreed on a position for online child protection legislation that eliminates requirements for global tech companies to scan and remove child sexual abuse material (CSAM). This development, reported by Reuters, represents a significant win for companies such as Google and Meta, which would otherwise have faced those obligations.
The European Council’s stance diverges from the European Parliament’s 2023 position, which would have required messaging services, app stores, and internet service providers (ISPs) to report and remove CSAM as well as instances of grooming. Under the new framework, those services face no such reporting or removal obligations.
Instead, the legislation shifts responsibility to major tech companies, which must assess the risks associated with their services and put preventive measures in place where necessary. The approach emphasizes proactive risk assessment over mandatory detection and removal.
Enforcement mechanisms fall under the purview of individual national governments rather than a centralized EU authority. Member states will appoint designated national authorities tasked with reviewing the risk assessments and mitigation strategies submitted by tech providers.
These national authorities can require providers to carry out specific mitigating measures where deemed essential. The European Council outlined this structure in a statement: “Member states will designate national authorities … responsible for assessing these risk assessments and mitigating measures, with the possibility of obliging providers to carry out mitigating measures.”
Failure to adhere to these directives carries financial consequences. Providers that do not comply may incur penalty payments, ensuring accountability at the national level while avoiding uniform EU-wide penalties.
The proposed text contains no provisions forcing providers to scan encrypted communications for CSAM, a concept still under discussion as recently as last year. Earlier talks had explored ways to balance child protection with privacy, but the current version omits any such mandate.
The text does, however, address the protection of encryption, specifying that it must be safeguarded so that secure communication channels are preserved amid the ongoing debate about surveillance.
Opposition has emerged from some quarters, including the Czech Republic. Critics argue that letting tech companies police their own content moderation could undermine encrypted platforms, and that this self-policing model risks an unintended erosion of privacy safeguards.
Czech politician Markéta Gregorová expressed strong reservations in a statement. She described the compromise as “a great disappointment for everyone who cares about privacy.” Gregorová further criticized the Danish presidency’s role, noting it “has pushed through a compromise version of the proposal after long negotiations, which, while appearing to be less invasive, actually paves the way for what we have long warned against: the blanket scanning of our private conversations.” Her comments highlight fears that the agreement could enable broader intrusions into personal communications over time.
The legislation introduces the EU Center on Child Sexual Abuse as a supportive entity. This center will assist member states in meeting compliance requirements and offer aid to victims of such abuse, providing resources for detection, prevention, and recovery efforts.
Separately, the European Parliament has advocated for establishing minimum age limits for children’s access to social media platforms. This call seeks to restrict exposure to potential harms, though no dedicated legislation addressing age verification is advancing at present.
The Council’s position is not the final word: negotiations between the Council and the Parliament are still to come, so the proposal has not yet received final approval.





