
Age assurance: a silver bullet?

Age assurance has been in the spotlight for some years now, often presented as a key step in implementing certain legal instruments on the protection of children online, such as the Digital Services Act (DSA). While this may be true, it should not be seen as a miraculous solution, but as one measure in the wider toolbox needed to promote safer experiences for children online.

Age assurance comprises all mechanisms designed to estimate or verify the age of a user (whether as a range of years – e.g., 13-18 – or a threshold – e.g., below or over 18), using a wide variety of methods: self-declaration, credit cards or ID documents, facial age estimation, behavioural profiling, etc. According to the latest research by ECPAT International, Eurochild and Terre des Hommes Netherlands, almost 60% of children supported online platforms verifying their age.

At a moment when digital policy debates are becoming unnecessarily technical and complicated, age assurance should not become an excuse to gatekeep children out of most digital spaces. While necessary to implement the DSA, age assurance should not be deployed in isolation: it must be regarded as a solution that sits within a wider range of measures preventing harm and empowering children online.

Despite the technical challenges that may exist, age assurance mechanisms must respect children's rights, paying specific attention to their effect on participation and privacy rights, while remaining effective and realistic.

Firstly, to ensure age assurance tools respect the full range of children’s rights, it is key that they are proportionate to the risk and designed and operationalised in the best interests of the child. A recent study commissioned by the European Commission sets out 10 requirements that need to be balanced when determining the method of age assurance to be implemented, including proportionality, privacy, accuracy, non-discrimination, participation and transparency. These requirements inevitably have to be weighed against each other. Proportionality considerations may help strike the balance between accuracy and privacy (e.g., a children’s platform may not need the same level of accuracy, and hence privacy intrusion, as one hosting adult content). Similarly, assessing children’s best interests may help resolve conflicts between participation and safety (e.g., safety needs may justify restricting children’s access to adult content, thereby limiting their participation).

Secondly, it is worth noting that the primary responsibility for implementing appropriate age assurance lies with the service provider. As part of the duty of care that online platforms have towards their child users, age assurance should be a legal imperative for services likely to be accessed by children. However, this does not mean that platforms should be the ones processing the data needed to verify a user’s age, especially given the sensitivity of children’s data and its potential use for commercial purposes. Nor does it mean that online platforms should be the ones developing these technologies, given their incentive to minimise the costs of such expensive tools.

The Spanish Data Protection Authority (AEPD) has become a pioneer in this space with the development of an age verification prototype – an app that securely scans an official identity document and generates a digital token, provided to the site requesting verification, that contains no information other than confirmation that the user meets the age requirement. It is increasingly important to ensure age assurance solutions are standardised and homogenised at European level to avoid asymmetries across Member States.
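For illustration, the sketch below shows in simplified form how such a token-based approach can limit what the requesting site learns: an issuer checks the date of birth locally and hands over only a signed claim that the age requirement is, or is not, met. This is a minimal, hypothetical sketch and not the AEPD implementation; a real deployment would rely on asymmetric signatures, accredited issuers and standardised token formats rather than the shared demo key used here.

```python
import base64
import hashlib
import hmac
import json
from datetime import date

# Hypothetical shared key for this demo only. A real issuer would sign tokens
# with an asymmetric key so relying sites can verify but never forge them.
ISSUER_SECRET = b"demo-only-secret"


def issue_age_token(date_of_birth: date, threshold_years: int) -> str:
    """Check the date of birth locally and issue a token carrying only a
    yes/no claim; the birth date itself is never embedded in the token."""
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    claim = {"age_requirement_met": age >= threshold_years,
             "threshold_years": threshold_years}
    payload = base64.urlsafe_b64encode(json.dumps(claim).encode()).decode()
    signature = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{signature}"


def verify_age_token(token: str):
    """Relying site: check the signature and read only the boolean claim."""
    payload, _, signature = token.partition(".")
    expected = hmac.new(ISSUER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return None  # reject tampered or forged tokens
    return json.loads(base64.urlsafe_b64decode(payload.encode()))


# Example: a user born in 2010 requesting access to an 18+ service.
token = issue_age_token(date_of_birth=date(2010, 5, 1), threshold_years=18)
print(verify_age_token(token))  # {'age_requirement_met': False, 'threshold_years': 18}
```

The key point of this design is data minimisation: the relying service never sees the identity document or the date of birth, only the signed yes/no assertion.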

In this complex environment, Eurochild highlights the need to harmonise the age of children’s consent across the EU and to provide harmonised GDPR guidelines on children’s data. More importantly, to ensure age assurance solutions respect children’s rights and remain effective, there is an urgent need for technical standards that lay down minimum principles of data minimisation, privacy preservation, data security and accessibility, and that respect the best interests of the child.

For more information, please contact Fabiola Bas Palomares, Lead Policy & Advocacy Officer on Online Safety.



