Online platforms' risk assessment reports under the DSA fall short on children's protection
Very Large Online Platforms have published their first risk assessment reports under the Digital Services Act (DSA), revealing significant shortcomings in addressing children's rights. Eurochild warns that these reports lack transparency, downplay serious risks to children, and propose insufficient safeguarding measures. Online platforms must do better to meet their legal obligations to protect children's rights.
With clear evidence of violations of children's rights on online platforms piling up every day, the DSA, adopted by the European Union in 2022, is a milestone in the regulation of the digital environment. For the first time, it sets out several obligations for online platforms and a number of specific requirements for Very Large Online Platforms - those with more than 45 million monthly active users in the EU. These companies must produce an annual report assessing the risks their platform poses to users, including children, and the measures they take to address them. The DSA builds on the responsibility of online platforms to provide a safe environment for all users, including children, in line with UN General Comment No. 25.
The first round of risk assessment reports was made public in November 2024. Eurochild analysed the reports of six platforms commonly used by children: for Facebook, Instagram and YouTube, the 2024 reports were analysed; for TikTok, X and Snapchat, the 2023 reports. All of them show fundamental flaws with regard to children's rights.
An alarming number of shortcomings and blind spots
Crucial risks faced by children online, such as cyberbullying and contact risks (grooming, sexual extortion, fraud, etc.), are insufficiently addressed in the reports. Even when major risks are recognised by platforms, they tend to be minimised: for instance, TikTok rates the risk of widespread child sexual abuse material (CSAM) as "unlikely", while reporting more than 850,000 items of suspected CSAM to the National Center for Missing and Exploited Children in 2023. Other risks are overlooked, such as issues related to popularity metrics and beauty filters, engagement maximisation or addictive design. In addition, certain platform features that amplify these risks are rarely examined, particularly how recommender systems and algorithmic design can fuel addiction and repetitive exposure to harmful content and contact.
A lack of ambition and a narrow approach to children’s rights
Risk assessments are often superficial. Most mitigation measures are generic, minimum safeguards that fall far short of the protection standards these platforms could achieve. The three most commonly proposed safeguards (content moderation, parental controls and age verification) are often described generically, without explaining how they mitigate specific risks to children.
Generally, the reports show a lack of ambition in the mitigation measures presented by companies, which undermines the DSA's objectives. Little emphasis is placed on key aspects such as children's privacy or design choices, particularly addictive features and dark patterns. In most cases, privacy settings do not adequately protect children from the risks of sexual extortion, exploitation or cyberbullying. More critically, the reports fail to address the specific risks faced by vulnerable children, such as children with disabilities or from disadvantaged backgrounds, leaving those most susceptible to harm without adequate protection.
A general lack of transparency
The reports lack the detailed data necessary to accurately evaluate the prevalence and severity of risks and the effectiveness of the mitigation measures. Even when such data exists, key figures, such as those on the dissemination of CSAM or on cyberbullying, are sometimes redacted from the public version. This contradicts the transparency expected of these documents and makes it difficult to assess their accuracy. More alarmingly, some claims made by online platforms have been found to be inaccurate or at least highly misleading, as demonstrated by the Center for Countering Digital Hate. The burden of fact-checking platforms' claims should not fall on civil society. Online platforms must also deliver on their obligations under Article 40 of the DSA and provide access to data for research purposes.
Online platform accountability is crucial to advancing children's protection under the DSA
Overall, the platforms' risk assessments fail to accurately represent the lived experiences of children online. They provide neither a thorough assessment of the risks nor the data needed to gauge their prevalence or the effectiveness of the safeguards deployed, despite the platforms' legal responsibility to do so under the DSA. This lack of accountability endangers the rights of children and, more broadly, of all users.
Moreover, these reports cannot set a precedent for the further implementation of the DSA, most notably regarding platforms' duties under Article 25 (prohibition of dark patterns) and Article 28 (protection of minors). We therefore call on online platforms to fully assume their responsibilities under the DSA and to conduct consistent and rigorous risk assessments, including input from children and civil society.