
Age assurance: a silver bullet?

Age assurance has been in the spotlight for some years now, often presented as a key step in implementing certain legal instruments for the protection of children online, such as the Digital Services Act (DSA). While this may be true, it should not be seen as a miraculous solution, but as one measure in the wider toolbox needed to promote safer experiences for children online.

Age assurance comprises all mechanisms designed to estimate or verify the age of a user (whether as a range of years – e.g., 13–18 – or a threshold – e.g., below or over 18), using a wide variety of methods: self-declaration, credit cards or ID documents, facial age estimation, behavioural profiling, and more. According to the latest research by ECPAT International, Eurochild and Terre des Hommes Netherlands, almost 60% of children supported online platforms verifying their age.

At a moment when digital policy debates are becoming unnecessarily technical and complicated, age assurance must not become an excuse to gatekeep children out of most digital spaces. While necessary to implement the DSA, age assurance should not be deployed in isolation: it must be regarded as one solution within a wider range of measures preventing harm and empowering children online.

Despite the technical challenges that may exist, age assurance mechanisms must respect children's rights, paying specific attention to their effect on participatory and privacy rights, while remaining effective and realistic.

Firstly, to ensure age assurance tools respect the full range of children’s rights, it is key that they are proportional to the risk and are designed and operationalised in the best interests of the child. A recent study commissioned by the European Commission sets out 10 requirements that need to be balanced when determining which method of age assurance to implement, including proportionality, privacy, accuracy, non-discrimination, participation and transparency. This balancing exercise is essential. Proportionality considerations can help strike the balance between accuracy and privacy (e.g., a children’s platform may not need the same level of accuracy, and hence privacy intrusion, as one hosting adult content). Similarly, assessing children’s best interests can help resolve conflicts between participation and safety (e.g., safety needs may justify restricting children’s access to adult content, thereby limiting their participation).

Secondly, it is worth noting that the primary responsibility for implementing appropriate age assurance lies with the service provider. As part of the duty of care that online platforms owe their child users, age assurance should be a legal imperative for services likely to be accessed by children. However, this does not mean that platforms should be the ones processing the data needed to verify a user’s age, especially given the sensitivity of children’s data and its potential use for commercial purposes. Nor should platforms be the ones developing these technologies, given their incentive to minimise the costs of such an expensive technology.

The Spanish Data Protection Authority (AEPD) has become a pioneer in this space with the development of an age verification prototype: an app that securely scans an official identity document and generates a digital token, provided to the site requesting the verification, which contains no information other than whether the user fulfils the age requirement. It is increasingly important to ensure age assurance solutions are standardised and harmonised at European level to avoid asymmetries across Member States.
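The data-minimising token flow described above can be illustrated with a minimal conceptual sketch. The function names, HMAC-based signing and shared secret below are illustrative assumptions for clarity only, not the AEPD prototype's actual design: the point is that the token attests a single boolean ("requirement met"), never the birth date or identity itself.

```python
import hmac
import hashlib
import json

# Hypothetical shared secret between the verification app and the relying
# site; a real deployment would use asymmetric signatures, not a shared key.
SECRET = b"demo-secret-key"

def issue_age_token(birth_year: int, required_age: int, current_year: int) -> str:
    """Issued by the verification app after scanning an ID document.

    The token carries only a boolean (requirement met or not) plus the
    threshold that was checked -- the birth date never leaves the app.
    """
    payload = json.dumps({
        "requirement_met": (current_year - birth_year) >= required_age,
        "required_age": required_age,
    }, sort_keys=True)
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def check_age_token(token: str, required_age: int) -> bool:
    """Run by the requesting site: verify integrity, then read the boolean."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(payload)
    return claims["required_age"] == required_age and claims["requirement_met"]

# Example: a user born in 2010 requests access to an 18+ service in 2025.
token = issue_age_token(birth_year=2010, required_age=18, current_year=2025)
print(check_age_token(token, required_age=18))  # False: requirement not met
```

Note that the requesting site never sees the document or the user's age, only a signed yes/no answer, which is the data-minimisation property the prototype is designed to guarantee.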

In this complex environment, Eurochild highlights the need to harmonise the age of children’s consent across the EU and to provide harmonised GDPR guidelines on children’s data. More importantly, to ensure age assurance solutions are child-rights-respecting and effective, there is an urgent need for technical standards that lay down minimum principles of data minimisation, privacy preservation, data security and accessibility, and that respect the best interests of the child.

For more information, please contact Fabiola Bas Palomares, Lead Policy & Advocacy Officer on Online Safety.



