EU targets TikTok, Instagram over addictive design as bloc considers social media ban for children

Reviewed by Nidhi Govil


The European Commission is preparing new regulations to protect children online by targeting addictive features like endless scrolling and autoplay on platforms including TikTok, Instagram, and Facebook. EU chief Ursula von der Leyen announced the bloc could propose a social media ban for children as early as this summer, backed by an EU-developed age verification app.

EU Takes Aim at Social Media Platforms Harming Children

The European Union is intensifying efforts to protect children online through sweeping new social media regulation targeting platforms including TikTok, Meta's Instagram and Facebook, and X [1]. Speaking at the European Summit on Artificial Intelligence and Children in Copenhagen on Tuesday, European Commission President Ursula von der Leyen announced that Brussels is preparing to crack down on what she described as harmful design practices that exploit young users [2]. The Commission chief emphasized that the risks children face online (including sleep deprivation, depression, anxiety, self-harm, and cyberbullying) are not accidental but stem from business models that treat children's attention as a commodity [4].

Source: Euronews

Targeting Addictive Design Features Across Major Platforms

The EU is specifically taking action against addictive design elements including endless scrolling, autoplay, and push notifications that keep young users engaged for extended periods [1]. Von der Leyen stated that the Commission is investigating TikTok for these features while also pursuing Meta because Instagram and Facebook are believed to be failing to enforce their own minimum age requirement of 13 [2]. The European Commission has also launched proceedings against X over its Grok AI tool, which has been used to create sexually explicit non-consensual content depicting women and children [4]. These investigations are being conducted under the Digital Services Act, which requires large platforms to take stronger action against illegal and harmful content.

Source: Reuters

Social Media Ban for Children Could Arrive This Summer

In a significant development, von der Leyen revealed that the EU could propose a bloc-wide social media ban for children as early as this summer [3]. The Commission has established an independent expert panel on online child safety to assess possible measures aimed at addressing addiction and mental health issues among minors. "Without pre-empting the panel's findings, I believe we must consider a social media delay. Depending on the results, we could come forward with a legal proposal this summer," she told delegates [3]. This timeline would allow Brussels to move ahead of French legislation expected in September, which would require platforms to block users under 15 and suspend existing accounts. Support for stricter controls has grown across the bloc, with France, Spain, Greece, and Denmark leading calls for stronger measures to protect children online.

Age Verification Technology and Digital Fairness Act

Addressing one of the main technical challenges, the EU has developed its own age verification app with what von der Leyen described as "the highest privacy standards in the world" [2]. Modeled on the EU Digital COVID Certificate system, the app will soon be available for member states to integrate into their digital wallets, making enforcement by online platforms more straightforward [3]. "No more excuses - the technology for age-verification is available," von der Leyen stated. Later this year, the Commission will propose the Digital Fairness Act, which will target harmful design practices including attention capture, complex contracts, and subscription traps, while setting strict limits on the use of artificial intelligence in social media [4].

What This Means for the Future of Online Child Safety

The EU's approach reflects a fundamental shift in how regulators view the relationship between social media platforms and young users. "The question is not whether young people should have access to social media, the question is whether social media should have access to young people," von der Leyen said [1]. The Commission is also investigating platforms that allow children to go down "rabbit holes" of harmful content, such as videos promoting eating disorders or self-harm [2]. While several EU governments have reacted cautiously to the age verification app and cybersecurity experts have raised concerns over potential technical vulnerabilities, the momentum for action appears strong. The EU is not alone in this effort: Australia and Indonesia have already introduced similar measures. Von der Leyen warned that hesitation would mean "another entire generation of children that pays the price" [3]. As the Digital Fairness Act takes shape and the expert panel delivers its findings, social media companies face mounting pressure to fundamentally redesign their platforms or risk significant regulatory action across Europe's single market.
