The EU formally linked "addictive design" on social media to "profit-driven" harm. This directly indicts the core engagement models that power platforms like TikTok and Instagram. Expect a battle over mandatory screen-time breaks and recommender system overhauls.
How We Got Here
The Digital Fairness Act follows the earlier Digital Services Act (DSA), under which the EU has already been investigating major platforms. On February 6, 2026, the Commission issued preliminary findings that TikTok breached the DSA, specifically citing harmful design.
The Numbers
- The Digital Fairness Act specifically targets features like endless scrolling, autoplay, and push notifications.
- EU Commission President Ursula von der Leyen announced the DFA at the European Summit on AI and Children in Copenhagen.
- Active DSA investigations are already underway against TikTok, Meta, and X, with Instagram and Facebook cited for failing to enforce their minimum age of 13.
- X faces scrutiny over its Grok AI tool, linked to generating sexual images of women and children.
- If confirmed, the preliminary findings of February 6, 2026 would require TikTok to fundamentally redesign its service, including mandatory screen-time breaks and recommender system changes.
🇮🇳 Why This Matters for India
For product managers in Bangalore and investors in Mumbai, this signals a future where engagement-at-all-costs models face mounting regulatory headwinds, potentially forcing a rethink of core design choices.
The Take
This signals a clear shift: ethical engagement metrics will increasingly override raw time-on-app as the primary north star. Indian consumer app founders should start stress-testing their growth loops for similar regulatory scrutiny.
Source: MediaNama