EU ministers pledged to enhance online child protection but rejected Denmark’s proposal to ban social media use for children under 15.
During the informal meeting of national telecom ministers on Friday, 25 member states, excluding Estonia and Belgium, signed the Danish Jutland Declaration, which was drafted by the Nordic nation as it chairs EU ministers' meetings during the second half of 2025.
The ministers stated that because children "require stronger and more targeted protection" and are more likely to be exposed to harmful, unlawful, and extremist content, they will investigate whether further measures are necessary to supplement current EU regulations.
According to a WHO report from 2024, the percentage of teenagers who use social media problematically — reporting trouble controlling their use and suffering negative effects — rose from 7% in 2018 to 11% in 2022.
One tool to mitigate the negative impact of illegal and age-inappropriate content is effective age verification. Such tools could be incorporated into the upcoming Digital Fairness Act, for which the Commission will present rules early next year.
“In the offline world, age checks are standard for age-restricted goods and services. So, it is reasonable to expect similar safeguards online, where the risks — especially for minors — are significant and well-documented," the declaration said.
"Without proper and trustworthy age verification, it is difficult to prevent for example social media from targeting minors with content and features designed for adults, putting their wellbeing at risk,” it added.
To "better protect children and young people in a digital reality where many experience declining well-being and increasing addiction to social media," Danish Prime Minister Mette Frederiksen said earlier this week that the nation would impose a ban on social media for children under the age of 15.
Some 94% of Danish youngsters used social media before the age of 13, despite the EU's requirement that children be at least 13 to have an account.
Last month, Ursula von der Leyen, the president of the European Commission, announced that the EU executive had established an expert panel to provide recommendations on how to limit children's access to social media in the EU.
On Friday, as part of its efforts under the Digital Services Act (DSA), the Commission also asked Google, Apple, Snapchat, and YouTube to furnish additional details about their age verification methods.
The Commission stated that it is interested in learning how they keep children away from harmful content, such as material that encourages eating disorders, and illicit products, such as narcotics or vapes.
Since the DSA went into effect in 2023, the Commission has launched eleven investigations into platforms, including X, TikTok, and Meta's Facebook and Instagram, for alleged violations ranging from recommender systems to election integrity.
None of the investigations has been concluded yet; the proceedings are still ongoing.
What measures did ministers agree to explore next?
Creating and employing improved age verification systems to better control access to inappropriate content.
Fostering collaboration and cooperation among member states to enforce existing digital safety legislation and expedite new regulation.
Increasing transparency obligations for online platforms regarding how they moderate content and protect children.
Promoting digital literacy and empowering minors and their families with better control of their online experience.
Advocating for tech companies to design and launch child-friendly services that limit access to harmful and extremist content.
Adopting a holistic model that leverages public-private partnerships for greater investment in safe digital infrastructure and innovation.
