TL;DR
The EU announced plans to crack down on TikTok and Instagram’s addictive features targeting children, including endless scrolling and autoplay. This move aims to enhance child safety online amid ongoing regulatory efforts worldwide.
The European Union has announced plans to regulate ‘addictive design’ features on TikTok and Instagram, targeting endless scrolling, autoplay, and push notifications in order to protect children from harm. The move marks a significant step in the EU’s ongoing effort to safeguard minors online and hold social media platforms accountable.
European Commission President Ursula von der Leyen stated Tuesday at the European Summit on Artificial Intelligence and Children that the EU is taking action against TikTok and Meta (owner of Instagram and Facebook) for their design features that encourage prolonged usage among children. The EU’s approach includes investigating platforms that enable children to encounter harmful content, such as videos promoting eating disorders or self-harm.
As part of this initiative, the EU has developed its own age verification app, which the Commission says adheres to the highest privacy standards globally. Von der Leyen emphasized that the technology for effective age verification exists and will be integrated into the digital wallets used by member states, making enforcement feasible. The EU aims to have a legislative proposal ready by summer 2026, pending advice from its ‘Special Panel of experts on Child Safety Online.’
Why It Matters
This development is significant because it represents a coordinated effort by the EU to regulate social media’s impact on children, addressing concerns about addiction and exposure to harmful content. It also signals a potential shift in global social media regulation, emphasizing stricter controls and accountability for platforms targeting minors.
The move comes amid broader international efforts to limit children’s exposure to addictive and harmful online content, including recent bans and proposed legislation in countries like Australia, Spain, France, and the UK. If successful, the EU’s measures could influence global standards and compel platforms to redesign features for safer use by minors.
Background
The EU has been increasingly active in regulating Big Tech over the past year, including fines against companies like Apple, Meta, and Google for antitrust violations. The focus on child safety follows recent U.S. court rulings that found design features such as infinite scrolling and autoplay contribute to addiction and mental health issues in teenagers. The EU also recently found Meta breached its Digital Services Act by failing to prevent minors from accessing its platforms, with minors easily bypassing age checks.
Globally, governments are considering or implementing stricter social media regulations, with Australia leading the way through its ban on social media for under-16s, enacted in December. The EU’s upcoming legislation aims to address these concerns comprehensively, covering both design practices and age verification.
“We are taking action against TikTok and its addictive design – endless scrolling, autoplay, and push notifications. The same applies to Meta, because we believe Instagram and Facebook are failing to enforce their own minimum age of 13.”
— Ursula von der Leyen
“No more excuses – the technology for age-verification is available.”
— Ursula von der Leyen
What Remains Unclear
It remains unclear exactly which legislative measures will be enacted and how platforms will be required to implement changes. Details on the enforcement timeline and potential penalties are still emerging. Additionally, TikTok, Meta, and other platforms have not yet publicly responded.
What’s Next
The EU plans to finalize its legislative proposal by summer 2026, after consulting its expert panel. Once introduced, the legislation will undergo further review and debate within EU institutions. Platforms will then have a defined period to comply with new rules, with enforcement measures likely to follow.
Key Questions
What specific features will the EU target on TikTok and Instagram?
The EU is focusing on features such as endless scrolling, autoplay, push notifications, and other design elements that encourage prolonged usage among children.
How will the EU verify the age of social media users?
The EU has developed an age verification app, which the Commission says adheres to the highest privacy standards, and which member states can integrate into their digital wallets to enforce age restrictions.
When will the new regulations take effect?
The EU aims to introduce a legislative proposal by summer 2026, with enforcement measures likely to follow after legislative approval and platform compliance periods.
How might these regulations impact social media companies?
Platforms may need to redesign features to reduce addictive elements and implement more robust age verification systems, potentially affecting their user engagement strategies and revenue models.
Why is this regulation important now?
Concerns over mental health, addiction, and exposure to harmful content among minors have increased, prompting regulators worldwide to seek stronger protections for children online.