HORSENS, Denmark – EU digital ministers are gathering in Denmark to sign a declaration on Friday that’s focused on a single goal: protecting children online.
A draft of the declaration, known as the Jutland Declaration – seen by Euractiv last week – had the Danish presidency stressing the “exceptional” need to safeguard children in the digital space but remained vague on details.
Multiple EU countries have, in recent months, signalled support for tougher measures, but key policy questions will need to be answered if the online safeguarding ambition is to deliver better conditions for kids.
Age limits: National or EU choice?
The draft declaration stopped short of calling for an EU-wide social media age limit – the age below which children would be barred from creating accounts on platforms like Instagram or TikTok.
European Commission President Ursula von der Leyen called the issue a top priority in her recent State of the Union address, eyeing the possibility of EU-wide rules to curb children’s use of social media services, which she characterised as profit-seeking and harmfully addictive.
But Brussels faces a legal roadblock: the Digital Services Act (DSA), the EU’s online governance rulebook, doesn’t provide a mechanism to set age restrictions. Commission spokesperson Thomas Regnier acknowledged the limitation, confirming that the DSA only allows the Commission to issue guidelines on how platforms can protect minors. He told reporters last month that the DSA “is not the legal basis” that could set social media age limits.
The Commission’s DSA guidelines leave it up to national governments to define their own so-called “digital age of majority” – the age at which a child can use digital services without parental consent.
That’s not entirely new. Under the bloc’s privacy rulebook, the General Data Protection Regulation (GDPR), EU countries already have the power to set their own digital age thresholds for processing children’s information, below which parental consent is required.
Denmark, spurred on by the Commission’s DSA guidelines, announced on Tuesday that it would ban children under 15 from several social media platforms.
Whether other EU countries will rally behind the Danish approach – or push for a common EU-wide age limit – remains to be seen.
Who bears responsibility – parents or platforms?
A deeper rift lurks beneath the surface: Who bears responsibility for protecting children online – parents or platforms?
Von der Leyen called out social media algorithms for preying on children’s vulnerabilities last month. Danish Prime Minister Mette Frederiksen and her digital minister, Caroline Stage Olsen, hold similar views, with Olsen telling Euractiv in an interview this summer that big tech is “data harvesting from minors”.
The draft of the Jutland Declaration was vague on the role and responsibilities of tech companies, merely noting a “need to explore” whether new measures are required in addition to the DSA.
For now, the tech industry appears to be setting its own standards. US giant Meta, which owns Facebook and Instagram, launched a public campaign backing the idea of a common EU digital age of majority in a bid to shape the debate before Brussels acts.
The draft declaration also highlights the role of parents, calling for them to be “involved” by being able to access tools like parental control software. But it draws a careful line, warning that responsibility should not be “transferred” to parents.
The tension – between giving parents controls without saddling them with the expectation of policing everything their kids do online – remains central to the debate.
(nl, vib)