Don’t look away: we must protect children from child sexual abuse online
The European Parliamentary Research Service (EPRS) and the European Council Legal Services have recently issued assessments questioning the feasibility of the EU proposal to prevent and combat child sexual abuse. These assessments risk undermining the major progress made by the European Commission in this important fight.
Both assessments conclude that the technologies to detect new content and grooming are not yet mature enough to ensure sufficiently high levels of accuracy. They also conclude that the regulation would unjustifiably infringe Articles 7 and 8 of the Charter of Fundamental Rights: ‘Everyone has the right to respect for his or her private and family life, home and communications’, and ‘the right to the protection of personal data’.
These studies are putting at risk a major step forward in the fight against child sexual abuse in the EU. With more than half of CSAM (child sexual abuse material) hosted in the EU in 2022, an alleged lack of technical or legal feasibility should not be used as an excuse to look away. It should instead be a mandate for decision-makers to make detection possible, as the lead rapporteur, MEP Javier Zarzalejos, highlighted at the LIBE Committee when the report was presented.
Few child rights organisations were consulted during the drafting of these studies, and Eurochild believes their analysis lacks a child rights-based approach. The positive and negative impacts of the proposed regulation on children’s rights are simply weighed against its impact on the fundamental rights of users. We believe this dichotomy violates children’s right to have their best interests taken as a primary consideration, as protected by Article 3 of the UNCRC.
It is concerning that neither study properly evaluates the seriousness of this crime, yet both call for the scope of the regulation to be narrowed in terms of the type of material, time limitations, users and so on. The EU has an unprecedented opportunity to move from a voluntary regime to a mandatory one, in which tech companies are held accountable for acting on this crime. Restricting the scope of the regulation would mean leaving millions of children unprotected from harm.
The proposal allows the use of automated technology to detect child sexual abuse and exploitation online only in specific cases determined by risk, and only when authorised by a judicial authority. It therefore does not impose a general monitoring obligation; rather, it limits the possibilities for detection, especially compared with the current scenario in which detection is voluntary and unregulated. It ensures that, for the first time, the privacy rights of users go hand in hand with the child’s right to protection. Moreover, the proposal encourages tech companies to improve the services children use through risk assessment and mitigation, promoting safety by design and making the internet a safer place for children.
It is imperative that we do not lose sight of the objective of this legislation: protecting children. We count on decision-makers to listen to the relevant actors from civil society who can provide data and guidance in finding the solutions needed to preserve this important regulation in its full scope, covering known and new child sexual abuse material as well as grooming.
We cannot fail the victims and survivors of online child sexual abuse. For them, the abuse does not only remain in their memory; it is constantly reshared and perpetuated online. The scale of this problem requires us to act and to use the opportunities offered by technology, just as we do in every other aspect of the digital transformation of our societies.
Read our response to the proposal and our response to the main European Parliament report on the file.