
Learn the facts: key insights on the EU's proposed CSA Regulation

Read more about the key facts and the main questions regarding the proposed Regulation to combat and prevent child sexual abuse.

Some groups are intentionally misleading you about what this proposed Regulation means for Europe. Will this Regulation unleash mass surveillance? How do detection technologies really affect privacy? Could the same technology be repurposed for pervasive surveillance by authoritarian governments? Most of these questions rest on a misinterpretation of the process established by the proposed Regulation and a misunderstanding of the technology involved.

The unsolicited contact of an adult with a child with sexual intent, and the dissemination of images and videos depicting the sexual abuse of a child, are breaches of the child's right to privacy. This proposed Regulation includes strong safeguards to guarantee that the privacy of both children and users is preserved. Detection of child sexual abuse (CSA) is a last resort: it would only happen after a thorough process of risk assessment and authorisation by a national court. The Regulation is built to ensure that privacy intrusions are minimised and proportionate to the risk of a child suffering online abuse. These safeguards include, among others, high security and privacy standards for detection technologies, judicial competence to issue detection orders, and review by national Coordinating Authorities and data protection authorities.

The Regulation mandates that only technologies approved by the EU Centre, according to high standards of security and privacy, will be available for detecting CSA. This technology exists and is effective and safe. In fact, many such tools are in use today to detect what the detractors of this Regulation claim is impossible (over 200 companies already use them). Thorn's Safer Tool, Google's Content Safety API, Facebook's AI Technology, and Apple's Communication Safety Tool are already deployed at scale to detect new/unknown CSAM and grooming.

Proactive detection of child sexual abuse is effective and essential in preventing its spread, since public reporting will never be sufficient. For example, 96% of the content that YouTube removes is flagged by automated detection technologies, and in most cases this happens before the video reaches 10 views. Preventative measures, such as digital literacy and risk assessment and mitigation, are crucial to building a digital environment that is safe by design for children, but on their own they will not stop the proliferation of child sexual abuse online. The CSAM and grooming that are detected and reported enable law enforcement to rescue children and arrest offenders every day across Europe.

Children’s rights to privacy and protection are not incompatible with users’ right to privacy. Still, this Regulation has faced fierce and loud opposition from privacy activists. Don’t be swayed by this: citizens overwhelmingly support the EU Regulation, as shown by recent ECPAT & Eurobarometer polls.

If you would like to learn more about how this Regulation protects children while safeguarding your privacy, read the full document provided by the ECLAG Steering Group, of which Eurochild is part.

For any questions, please contact Fabiola Bas Palomares, Policy & Advocacy Officer on Online Safety.




Related News/Events

22 May 2024

Digital - Eurochild’s achievements in 2023

Annual Report 2023. According to our latest research, children are going online for the first time at the age of 9.6 years old, despite most online platforms only being accessible…
8 April 2024

New Study Reveals: Children Left Alone To Deal with Online Dangers

Press Release - New research by ECPAT International, Eurochild, and Terre des Hommes Netherlands reveals an inconvenient truth: children often rely on their instincts to navigate the digital world due…
5 March 2024

Joint Statement - Global Digital Compact Stakeholders Informal Consultation

Eurochild joins nine organisations calling for a Global Digital Compact (GDC) that prioritises the promotion, protection and implementation of children's rights in the digital environment. The UN Global Digital Compact aims…