Eurochild's position on age restrictions on social media
A call to rethink the business model of social media to address risks for children.
The debate on age restrictions on social media should be used to push for substantive reform of how social media operates, and to discuss how companies can prioritise children’s rights over profit and uphold those rights as a condition for operating in European countries. The choice is not simply between a “ban” and “no ban”. That framing obscures the real issue.
The real choice is whether we accept a digital environment designed around profit and attention capture, or insist on platforms that are accountable, transparent, and safe-by-design for children. This paper argues that age restrictions alone won’t keep children safe, so the EU should prioritise children’s rights-based, safe-by-default regulation that tackles platforms’ risk-driving business models and design choices, making platforms safer for children and therefore safer for everyone.
Age restrictions will only work if platforms are also held accountable for risk-driving business models and design choices (attention extraction, profiling, addictive features) through child-centred, safe-by-design legislation and enforcement. We do not call for a blanket ban, but for a rights-based framework in which any age-gating is necessary, proportionate and privacy-preserving, paired with stronger independent risk assessments, researcher access to data, and state investment in offline support rather than the outsourcing of children’s rights to platforms.
Key messages and recommendations:
- Age restrictions can never replace regulation or company responsibility
Even if age limits exist, they will not address harms on their own. Existing rules (such as the DSA, GDPR and AI Act) must be enforced strongly, and new rules (including on online child sexual abuse) are vital, because harms happen beyond “social media” and are not driven solely by whether a child can access a given platform.
- Children’s rights apply to everyone under 18, with safeguards that evolve by age
Children’s rights don’t switch off because of age gates. Safeguards should be stronger for younger children, while older teenagers should be empowered in line with their evolving capacities.
- Platform power requires structural accountability
Children experience social media in different ways, but it is neither fair nor realistic to expect every child and caregiver to “self-manage” services intentionally designed to be hard to disengage from. Protective design must be required by law and set as the default.
- Data extraction for behavioural advertising and engagement optimisation must end
The current business model treats children’s identities, emotions and behaviours as monetisable assets. Companies must not use minors’ data for commercial purposes; it is equally crucial to raise awareness of sharenting and childfluencers, protecting children’s privacy and dignity and safeguarding them from exploitation.
- The business model must change: reduce harm at the source
Eurochild calls for action against design features that fuel compulsive use and risky exposure (e.g., infinite scroll, autoplay, manipulative nudges) and against exposure to illegal content. Social media must be fundamentally redesigned and independently monitored.
- Social media must be safe by default
Some people will always get around age checks. That’s exactly why platforms must not keep a “wild west” experience for anyone who isn’t logged in. High privacy and safety should be the default for everyone.
- Regulation must be strengthened to make risk assessments independent and much more robust
Independent, detailed and compulsory standards are needed to robustly review risks and to set proportionate minimum-age and age-assurance requirements using privacy-preserving technology.
- Independent access to platform data is essential, and more research is needed
After years of controversies and scandals, trust cannot be rebuilt without independent scientific scrutiny and interdisciplinary research combining survey data with objective platform data and (where appropriate) neuroscientific evidence. Researchers must be given meaningful access to platform data.
- Age assurance and verification should protect privacy and avoid discrimination
If age checks are used, they must be reliable, non-intrusive and non-discriminatory. Eurochild points to the EU Digital Identity Wallet as a potentially more privacy-preserving option, but only if it is properly tested and safe for marginalised children.
- Governments can’t outsource their responsibilities to platforms
If social media is filling gaps (safe spaces, youth services, mental health support), that’s a warning sign. States must invest in real offline and online support; platforms can’t replace public responsibility.