Eurochild’s impact on the first report on Article 35.2 of the Digital Services Act
Eurochild’s insights on online platform risks affecting minors have been integrated into the first report under Article 35(2) of the Digital Services Act (DSA).
The DSA risk management framework in Articles 34 and 35 establishes key obligations for providers of very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”), i.e. services with at least 45 million monthly active users in the Union.
Article 35(2) DSA requires the European Board for Digital Services, in cooperation with the Commission, to publish comprehensive yearly reports identifying and assessing the most prominent and recurrent systemic risks in the Union and in the Member States, as well as best practices for their mitigation.
Published on 18 November 2025, the report was compiled by analysing publicly available DSA risk assessments and independent audits from designated VLOPs/VLOSEs (plus related transparency outputs such as ad repositories, content-moderation reports and the DSA Transparency Database), complemented by studies and evidence from the Commission and Digital Services Coordinators.
The Board also drew on submissions from civil society, researchers and trusted flaggers.
The impact of Eurochild’s contribution
Eurochild’s contribution to the report is reflected throughout the document. The report builds on Eurochild’s recommendation, noting that “the generation of harmful material using AI-powered technologies is an increasing trend.” In this context, AI-powered technologies can not only facilitate the creation of child sexual abuse material (CSAM) and the perpetration of grooming and sexual extortion, but also enable forms of emotional manipulation through AI chatbots and companions.
The report recognises that threats manifest differently across services, depending on factors such as a platform’s design, features and functions. It also confirms that some of the most recurrent systemic risks affecting children online include exposure to and dissemination of CSAM and grooming. In line with our recommendations, the publication addresses harmful content promoting the sexualisation of children, cyberbullying, gaps in content moderation, and the commercial exploitation of childfluencers.
Eurochild’s advocacy on tackling addictive design and manipulative features is also reflected in the report, which highlights risks linked to infinite scroll, autoplay, beauty filters and dark patterns designed to maximise user engagement. It notes that “several CSOs mentioned that such features and designs may be highly addictive. Other CSOs noted systemic risks related to platform designs set to maximise time spent on an app or nudge users to act in certain ways as to gain attention and validation”. The report further recognises that recommender systems can amplify the spread of harmful content, in line with our contribution’s call to reduce their hyper-personalisation.
Concerning mitigation measures, the report includes many of our suggestions, such as content moderation and restrictions, limits on interactions between minors and adults, high privacy settings by default and algorithms prioritising age-appropriate content. These measures reflect a safety-by-design approach.
At Eurochild, we will continue working to ensure that EU digital policies and legislation reflect children’s rights, including by monitoring the DSA’s implementation and advocating for it to strengthen children’s rights in the digital world.
For further information, contact Francesca Pisanu, EU Advocacy Officer, and Sofia Montresor, Policy & Advocacy Intern – Online Safety.