
Eurochild’s impact on the first report on Article 35.2 of the Digital Services Act

Eurochild’s insights on the risks online platforms pose to minors have been integrated into the first report on Article 35.2 of the Digital Services Act (DSA).

The DSA risk management framework in Articles 34 and 35 establishes key obligations for providers of very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”), i.e. services with at least 45 million monthly active users in the Union.

Article 35(2) DSA sets out the requirement for the European Board for Digital Services, in cooperation with the Commission, to publish comprehensive reports once a year, identifying and assessing the most prominent and recurrent systemic risks in the Union and in the Member States, as well as best practices for their mitigation.

Published on 18 November 2025, the report was compiled by analysing publicly available DSA risk assessments and independent audits from designated VLOPs/VLOSEs (plus related transparency outputs such as ad repositories, content-moderation reports and the DSA Transparency Database), complemented by studies and evidence from the Commission and the Digital Services Coordinators.

The Board also drew on submissions from civil society, researchers and trusted flaggers.

The impact of Eurochild’s contribution

Eurochild’s contribution to the report is reflected throughout the document. The report builds on Eurochild’s recommendation, noting that “the generation of harmful material using AI-powered technologies is an increasing trend.” In this context, AI-powered technologies can not only facilitate the creation of CSAM and the perpetration of grooming and sexual extortion, but also enable forms of emotional manipulation through AI chatbots and companions.

The report recognises that threats can manifest differently across services, depending on factors such as the platform’s design, features and functions. It also confirms that some of the most recurrent and systemic risks affecting children online include exposure to and dissemination of child sexual abuse material (CSAM) and grooming. In line with our recommendations, the publication addresses harmful content promoting the sexualisation of children, cyberbullying, gaps in content moderation, and the commercial exploitation of childfluencers.

Eurochild’s advocacy on tackling addictive designs and manipulative features is reflected in the report. It highlights risks linked to infinite scroll, autoplay, beauty filters and dark patterns created to maximise user engagement. Indeed, it considers that “several CSOs mentioned that such features and designs may be highly addictive. Other CSOs noted systemic risks related to platform designs set to maximise time spent on an app or nudge users to act in certain ways as to gain attention and validation”. Additionally, the report recognises that recommender systems can amplify the spread of harmful content, in line with calls in our contribution to reduce their hyper-personalisation capacity.

Concerning mitigation measures, the report includes many of our suggestions, such as content moderation and restrictions, limits on interactions between minors and adults, high privacy settings by default and algorithms prioritising age-appropriate content. These measures reflect a safety-by-design approach.

At Eurochild, we will continue working to ensure EU digital policies and legislation reflect children’s rights, including by monitoring the DSA’s implementation and advocating for it to strengthen children’s rights in the digital world.

For further information, contact Francesca Pisanu, EU Advocacy Officer, and Sofia Montresor, Policy & Advocacy Intern – Online Safety.
