The “tobacco moment” for social media platforms is here
A landmark US verdict underscores growing evidence that harmful platform design must trigger stronger EU action to protect children online.
On 25 March 2026, a Los Angeles jury delivered a landmark verdict against Meta and YouTube. The jury found both companies liable for the mental harm suffered by a young woman who began using the platforms as a child, concluding that both had acted with malice and been negligent in the design of their platforms.
It apportioned responsibility at 70% to Meta and 30% to YouTube, and found that punitive damages were warranted. Jurors concluded that the companies were negligent, that their negligence was a substantial factor in the harm suffered, and that they had failed to adequately warn users of the risks. That same week, a New Mexico jury found Meta liable for misleading consumers about the safety of its platforms and endangering children.
For Eurochild, this verdict confirms what we have said consistently, including in our position paper Age restrictions on social media - A call to rethink the business model of social media to address risks for children. The core problem is not simply whether harmful content appears on platforms. It is that many platforms are designed in ways that maximise engagement and profit, often at the expense of children’s rights, wellbeing, privacy and safety. Eurochild’s position paper argues precisely for addressing the business models and design choices that drive risk.
Some companies have chosen to settle similar claims before trial. But the broader point remains unchanged. This is not about one company, one plaintiff or one courtroom. It is about a systemic model that has too often rewarded attention extraction, compulsive use and commercial gain over children’s best interests.
What makes this case particularly significant is that it focused on product design. Plaintiffs argued that features such as infinite scroll, autoplay and other engagement-driven mechanisms helped keep children hooked. That distinction matters. It shifts the discussion away from the idea that platforms are merely passive hosts of user content and towards corporate accountability for services deliberately engineered to shape behaviour.
This matters especially for children. Young users are still developing, and digital services should not be built in ways that exploit developmental vulnerabilities. Designs based on constant prompts, frictionless and endless consumption, and unpredictable rewards can make disengagement harder. The burden cannot continue to fall on children and parents alone to navigate systems designed to keep them online for as long as possible. The responsibility must lie first with the companies creating and profiting from these environments.
Europe is not standing still. The Digital Services Act already requires online platforms accessible to minors to protect their mental and physical wellbeing, privacy and security. In February 2024, the European Commission opened formal proceedings against TikTok under the DSA. More recently, in February 2026, the Commission preliminarily found TikTok in breach of the DSA for addictive design, including features such as infinite scroll, autoplay, push notifications and its highly personalised recommender system. The EU has therefore already begun to recognise that harmful design is not incidental to the problem, but central to it. Furthermore, last week the Commission preliminarily found that Pornhub, Stripchat, XNXX and XVideos may have failed to properly prevent children’s access to pornographic content, while Snapchat is also under investigation over child safety concerns.
This verdict should be a wake-up call for Europe. Enforcement must be ambitious, timely and firmly grounded in children’s rights. Platforms accessible to children should be safe and rights-respecting by design. That means privacy by default, robust risk mitigation, strong limits on exploitative and manipulative features, and real accountability when companies fail.
The ‘tobacco moment’ for social media platforms is here: a moment when mounting evidence, public scrutiny and legal accountability are exposing how profit-driven design choices can cause harm, especially to children. Public opinion backs action: 90% of people in the EU see protecting children online as a political priority. The question now is whether EU policymakers are ready to act on it. At Eurochild, we stand ready to continue supporting them.