Children are still left in a Digital Wild West
Text by Mie Oehlenschläger, first published at Dataethics.eu
It took more than a decade for much of the political establishment and the general public to realize that the digital environment is not designed for children. We are working on cleaning up – but the ‘old’ social media problems are overshadowing the challenges of generative AI.
The repercussions of having legitimized a childhood dominated by unregulated online communities and information ecosystems based on either complete anonymity or surveillance capitalism are hard to document. And for years, commercial digitalization was understood by the majority as the next step on a Darwinian ladder naturally leading humanity towards a better society. Surely, tech will fix it, many said. And adults who didn’t understand that “this is just the new reality” not only violated the child’s right to participate in that new reality – they actually inflicted screen shame on children.
But fortunately, things tend to change. Not on their own – but partly because passionate people tirelessly point out the problems. And while politicians, civil society and others come to their senses and start facing the problems, tech companies continue their unethical practices, their lobbying, their exercise of soft power and their general abdication of responsibility.
This is one of the reasons why the change towards a world where children are actually better protected in the digital environment is so slow.
Yet, there are interesting things happening that are moving in the right direction. I will mention three of them here; they are happening at a large scale, and they are important to know about.
1) The EU is working hard on the enforcement of the Digital Services Act. Right now, the Commission is asking civil society, researchers and others for feedback on the ‘Guidelines for the protection of minors online under the Digital Services Act’ – that’s Article 28 of the DSA. These guidelines “aim to support platforms accessible to minors in ensuring a high level of privacy and security for children, as required by the DSA”.
What’s interesting here is that it all comes down to design – a point illustrated in the sketch after the list below. For example, platforms should:
– Implement age safeguards that reduce the risk of children being exposed to pornography or other age-inappropriate content.
– Set children’s accounts to private by default, reducing the risk of unsolicited contact from strangers.
– Adjust their recommender systems and prioritize explicit signals from users about whether they like or dislike the content they are viewing, reducing the risk of children falling down rabbit holes of harmful content.
– Enable children (and their parents) to block and mute any user, and ensure they cannot be added to groups without their explicit consent, which can help reduce the risk of cyberbullying.
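To make the design point concrete, here is a minimal, hypothetical sketch of what “safety by default” can mean in code. The settings and names below are illustrative only – they are not taken from the DSA guidelines or from any real platform’s API – but they show the core idea: protective values are the default whenever the account holder is a minor, rather than something a child has to find and switch on.

```python
# Hypothetical sketch of "safe by default" settings for an under-18 account,
# mirroring the design measures listed above. All names are illustrative and
# not taken from the DSA guidelines or any real platform API.
from dataclasses import dataclass

@dataclass
class AccountSettings:
    private_profile: bool                       # who can see the account
    allow_stranger_contact: bool                # unsolicited messages from unknown users
    group_add_requires_consent: bool            # explicit opt-in before being added to groups
    recommender_prefers_explicit_signals: bool  # weight like/dislike over watch time

def default_settings(is_minor: bool) -> AccountSettings:
    """Protective defaults apply whenever the user is (declared or verified) under 18."""
    if is_minor:
        return AccountSettings(
            private_profile=True,
            allow_stranger_contact=False,
            group_add_requires_consent=True,
            recommender_prefers_explicit_signals=True,
        )
    # Adults get the platform's ordinary defaults and can change them freely.
    return AccountSettings(
        private_profile=False,
        allow_stranger_contact=True,
        group_add_requires_consent=False,
        recommender_prefers_explicit_signals=False,
    )

print(default_settings(is_minor=True))
```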
You can read more here and give your feedback until June 10th.
2) Also in the EU, and in parallel to the implementation of the DSA, the Commission is working on an age verification app that will serve as an interim solution until the EU Digital Identity Wallet becomes available at the end of 2026. The app is based on the same technology as the EU Wallet and will allow online service providers to check whether users are 18 years or older without compromising their privacy. The goal of the project is to develop an EU-harmonized, privacy-protecting age verification solution, including a white-label, open-source app, by summer 2025. The first version of the technical specifications and the beta version are already available on GitHub.
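To illustrate the privacy idea behind such a solution, here is a simplified, hypothetical sketch. The real EU system is built on the Digital Identity Wallet architecture with proper verifiable credentials; the HMAC below merely stands in for the issuer’s digital signature, and the year-based age check is deliberately crude. The point is the data flow: the service only ever sees a signed “over 18” yes/no claim, never a birthdate or an identity.

```python
# Hypothetical, simplified sketch of privacy-preserving age attestation.
# The real EU solution uses verifiable credentials and the EU Digital
# Identity Wallet; the HMAC here only stands in for a digital signature.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-key"  # stand-in for the issuer's signing key

def issue_age_attestation(birth_year: int, current_year: int) -> dict:
    """Issuer (e.g. a government eID) attests *only* that the holder is 18+.
    The birthdate itself never leaves the issuer/wallet side.
    (Year arithmetic is approximate; a real system checks the full date.)"""
    claim = {"age_over_18": current_year - birth_year >= 18}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": tag}

def verify_age_attestation(attestation: dict) -> bool:
    """The online service checks the signature and the single boolean claim.
    It learns nothing else about the user: no name, no exact age."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["signature"])
            and attestation["claim"]["age_over_18"])

att = issue_age_attestation(birth_year=2010, current_year=2025)
print(verify_age_attestation(att))  # False: a 15-year-old is not let through
```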
3) In April, after years of work by 5Rights (where I am President of the Board of Directors) and others, Indonesia passed a governmental regulation on child protection inspired by the UK’s Age Appropriate Design Code. 5Rights describes it as a landmark move:
“5Rights welcomes this new law that protects the nation’s 84 million children online and promotes fair global standards of technological accountability.”
The regulation is being developed in collaboration with the standards organization IEEE, which will provide strategic expertise through its work on design standards. They say: “This regulation is the first of its kind in Asia and the Global South and sets enforceable requirements for digital platforms to protect children’s privacy, safety and well-being as a robust, holistic approach to fighting children’s online addiction. Indonesia’s adoption of this landmark regulation is an important step forward for online child protection on a global scale.” (https://standards.ieee.org/news/indonesia-age-appropriate-design/)
Increasingly, work on online child protection is also about standards, and IEEE has been a strong advocate for greater security. You can read more about IEEE’s work here.
GenAI Flies Under The Radar
While the “old parts of the internet” such as social media, chat platforms and online games are becoming subject to more regulation, generative AI for children is still flying under the radar. In a new study, the Alan Turing Institute calls for more focus on “children’s use of AI” and “a child centered approach” to AI. 5Rights has launched a Children & AI Design Code because of the long list of inherent risks that generative AI poses, also to children.
It’s very important to focus on tech as a fundamental issue of design.
But regardless of design and underlying business models, someone always owns and controls the software, and unless their power is kept on a very short leash, we face a collective civilizational challenge of the first order when AI enters the innermost spheres of existence. And the question is whether the nursery should not be protected from it in the first place.