
Strengthening child rights online through the DSA guidelines on minor protection

On 13 May, the European Commission published its draft guidelines on the protection of minors, which aim to guide the implementation of Article 28 of the Digital Services Act (DSA). While the guidelines are non-binding, they will steer national and EU regulators in enforcing a high level of privacy, safety and security for children online. Eurochild contributed to the Commission's open call for consultation, with input from our members, and collaborated with 38 organisations and independent experts from across Europe to publish a joint statement.

A good starting point

Eurochild applauds the child rights focus of the guidelines, which consider children’s participatory rights, their evolving capacities and their agency. Putting children’s rights at the centre is the only way to ensure safe and empowering digital experiences for children. The guidelines also place a welcome emphasis on system design, addressing not only high privacy and safety default settings but also manipulative and engagement-driven practices, going beyond traditional safety approaches. Building on this strong basis, several provisions and sections of the guidelines can be strengthened.

The need for strong guiding principles: taking into account child participation and vulnerable children

Meaningful child participation and systematic consideration of vulnerable children should be enshrined as general guiding principles, as these issues should lie at the heart of every policy, service and feature that online platforms develop for children. In line with the aforementioned child rights approach, Child Rights Impact Assessments should be included as a core recommendation in any risk review foreseen in the guidelines and should be conducted in consultation with experts, civil society and children. This is especially important because self-performed risk assessments can underestimate the risks children face, as shown by the poor quality of the first round of risk assessment reports submitted by Very Large Online Platforms (VLOPs) under Articles 34 and 35 of the DSA.

A more thorough approach for recommender systems, content moderation and commercial practices

While the guidelines contain crucial provisions on default settings, recommender systems, content moderation and commercial practices, these could be further improved. Recommender systems should prioritise the best interests of the child over maximising engagement, including by basing recommendations exclusively on explicit user signals (e.g., actively liking or sharing content). Content moderation provisions should address the spread of harmful content that is seen repeatedly, and set appropriate timeframes and criteria for removal after reporting (e.g., prioritising the moderation of content that may exploit children's vulnerabilities). Finally, the guidelines should address commercial practices beyond advertising, including by prohibiting any form of economic exploitation of children by the platform (e.g., manipulative loot boxes) or by other users (e.g., influencers).

Limited provisions addressing AI risks to children

Finally, the guidelines are a good opportunity to complement other instruments regarding the use of Artificial Intelligence by online platforms, especially AI chatbots. We recommend that the guidelines include provisions to ensure that AI chatbots are easy to turn off or disengage from, and that they do not steer children towards commercial content or purchases, amplify fake news, or emulate child-like features or interactions. More broadly, any AI system accessible to children should be developed, trained and used ethically, considering the full spectrum of children's rights.

These guidelines have the potential to significantly improve children's online experiences by providing clearer standards and good practices for online platforms and regulators. As we call on the Commission to set the bar high in enforcing the DSA, Eurochild strongly welcomes the ambition of this first draft and looks forward to seeing its provisions strengthened.
