Protecting children’s rights online is a strategic choice
Blog by Fabiola Bas Palomares, Lead Policy & Advocacy Officer on Online Safety, on how increasing online child protection is both a political and an economic strategic choice that must be prioritised in the European Union.
Globally, it has been estimated that 1 in 3 internet users is a child. With the age at which children first go online rapidly decreasing, this share is bound to rise. While it would be easy to assume that the safeguards children enjoy offline have been translated to the digital environment, the reality is far from it.
The UN Convention on the Rights of the Child (UNCRC) states that children must be protected from all forms of abuse. However, reports of online child sexual abuse keep increasing, especially of younger children aged 7-10, with 60% of the material found being hosted in the EU. The inter-institutional negotiations around the EU legislative proposal to combat child sexual abuse risk excluding the possibility for companies to detect and remove online grooming on their services. Similarly, children’s right to privacy is severely undermined both by the threat of child sexual abuse and by abusive personal data collection practices. Children’s best interests, one of the core principles of the UNCRC, are often secondary to algorithms and dark patterns that manipulate children to satisfy commercial demand for their data.
Changing this reality takes a multi-stakeholder approach. Digital literacy and skills are an integral part of online child safety and a strong catalyst for realising children’s rights online. However, as highlighted by Eurochild members in Paving the way to realise children’s rights online in Europe, digital upskilling programmes must be complemented with robust legislation that holds online platforms accountable for ensuring the rights of all their users and caters for the special vulnerabilities of children.
This past year, Eurochild has been advocating for better EU policy in this direction, with a special focus on the severe threat of child sexual abuse. The EU and its Member States are bound by the UNCRC, which forms part of the EU acquis. Guided by General Comment 25 on children’s rights in the digital environment, the EU should press ahead with the successful implementation of the Digital Services Act and guarantee strong legislative frameworks to combat online harm and online risks, most notably cyberbullying and all forms of child sexual abuse.
However, additional policy actions are urgently needed. The unresolved issue of age assurance must be addressed with the aim of fostering safe and age-appropriate experiences for children online. Along the same lines, intentionally persuasive digital designs that manipulate children’s behaviour (e.g., by creating addictions or luring them into unexpected purchases) must be properly regulated. In addition, the effects of digital technologies and online harm on children’s mental health must be recognised and enshrined in EU policy.
Through a dedicated task force, we have built the capacity of Eurochild members to advocate for strong digital policy for children, while leveraging their expertise to feed into EU policies, for example by collecting good practices and recommendations. Finally, we have been listening to children’s views and needs, paying special attention to protection from child sexual abuse. Together with ECPAT and Terre des Hommes, we have consulted almost 500 children through the VOICE project on online safety and protection from child sexual abuse. Stay tuned for the final results coming in April.
The reality is that children are growing up in an environment that was not designed with their needs and vulnerabilities in mind. Just as we design parks to be safe, inclusive and child-friendly, online platforms must be designed to fulfil children’s rights and meet their needs. This past year, I have learnt that increasing online child protection is both a political and an economic choice: political, because policy-makers must hold online platforms accountable for the products and services they design; economic, because online platforms must start embedding children’s rights into their business models by making design choices that do not harm or limit them. What is certain is that the current lack of boundaries and accountability in the digital environment leaves children themselves with little choice.