The European Union has officially decided that scrolling through short videos is a public health crisis worthy of government intervention. On February 6, 2026, the European Commission announced preliminary findings that TikTok’s core design features — its infinite scroll and personalized recommendation algorithm — violate the Digital Services Act, marking what may be the most brazen regulatory overreach in the history of social media governance. The accusation: that TikTok’s app is too engaging, too personalized, and too good at keeping people entertained, especially children. In Brussels, apparently, building a successful product is now a punishable offense.

According to The New York Times, EU regulators concluded that TikTok’s design leads to “compulsive” behavior among users, with particular concern directed at minors. The Commission’s preliminary findings assert that the platform’s autoplay function, its endless feed, and its algorithmically curated content recommendations create what officials describe as an “addictive design” that fails to adequately protect young users. If the findings are confirmed, TikTok could face fines of up to 6% of its global annual revenue — a staggering sum that would amount to billions of dollars and send shockwaves through every technology company operating on the continent.
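For a sense of scale, here is a back-of-the-envelope sketch in Python. The 6% ceiling comes from the DSA itself; the revenue figure is a purely hypothetical placeholder, since TikTok’s actual number is not part of the Commission’s announcement:

```python
# The DSA caps fines at 6% of global annual revenue. The revenue figure here
# is an assumed placeholder for illustration only, not TikTok's reported number.
hypothetical_revenue_usd = 20_000_000_000  # assume $20B in global annual revenue
DSA_FINE_CAP = 0.06                        # the DSA's 6% ceiling

max_fine = hypothetical_revenue_usd * DSA_FINE_CAP
print(f"Maximum fine: ${max_fine:,.0f}")   # -> Maximum fine: $1,200,000,000
```

Even on a conservative assumption, the exposure runs comfortably into ten figures.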

The DSA’s Expanding Tentacles: From Content Moderation to Product Design

The Digital Services Act, which came into full effect in 2024, was sold to the European public as a necessary tool to combat illegal content, disinformation, and online harms. But the TikTok case reveals the DSA’s true ambitions: it is not merely a content moderation framework but a sweeping regulatory apparatus that empowers unelected bureaucrats in Brussels to dictate how technology products are designed, how algorithms function, and ultimately, how information is distributed to hundreds of millions of people. The fact that the EU is now targeting the fundamental mechanics of a social media feed — the scroll itself — suggests there is virtually no aspect of digital product design that falls outside the Commission’s self-appointed jurisdiction.

As reported by TechCrunch, EU regulators are not simply asking TikTok to add warning labels or age-verification gates. They are demanding that the company fundamentally disable or restructure features that are central to its product experience, including the recommendation engine that has made TikTok one of the most popular social media platforms in the world. The Commission wants TikTok to offer users, particularly minors, a version of the app stripped of personalized recommendations and infinite scrolling. In other words, Brussels wants TikTok without the things that make TikTok what it is. One might as well ask a restaurant to serve food without flavor and then wonder why nobody shows up.

Defining ‘Addiction’ Down: When Engagement Becomes a Crime

The intellectual foundation of the EU’s case rests on the premise that “addictive design” is a meaningful regulatory category — a premise that deserves far more scrutiny than it has received. The Commission’s language borrows heavily from the vocabulary of substance abuse and behavioral psychology, describing TikTok’s features as producing “compulsive” usage patterns. But there is a vast and important difference between a chemical dependency on opioids and a teenager’s preference for watching dance videos over doing homework. By conflating genuine addiction with high user engagement, the EU is constructing a regulatory framework so elastic that it could theoretically be applied to any product that people enjoy using frequently — from video games to streaming services to the humble page-turner novel.

The Commission’s findings specifically target TikTok’s recommendation algorithm, which uses machine learning to surface content tailored to individual users’ interests and viewing habits. This is the same basic technology that powers Netflix’s movie suggestions, Spotify’s Discover Weekly playlists, and Amazon’s product recommendations. If an algorithm that learns user preferences and serves relevant content is inherently “addictive” and therefore illegal, then the EU has effectively declared war on personalization itself — one of the foundational innovations of the modern internet. The implications extend far beyond TikTok. Every major platform that uses algorithmic curation is now on notice that Brussels considers their core technology suspect.
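To see how ordinary the underlying mechanics are, consider a minimal sketch of interest-based ranking. Everything here is hypothetical and generic; it is not TikTok’s system, merely the basic pattern shared by virtually every personalized feed:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Video:
    video_id: str
    topics: frozenset[str]

def score(video: Video, interests: dict[str, float]) -> float:
    # Sum the user's learned affinity for each topic the video touches.
    return sum(interests.get(topic, 0.0) for topic in video.topics)

def recommend(candidates: list[Video], interests: dict[str, float], k: int = 10) -> list[Video]:
    # Rank candidates by predicted interest and keep the top k: the same basic
    # move behind movie suggestions, weekly playlists, and product recommendations.
    return sorted(candidates, key=lambda v: score(v, interests), reverse=True)[:k]
```

Swap in richer signals such as watch time, likes, and shares, and the same loop describes nearly every feed on the internet. The EU’s theory offers no principled line between this pattern and the one it now calls illegal.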

The Mommy State Knows Best: Parental Responsibility vs. Government Paternalism

Perhaps the most troubling aspect of the EU’s action is what it reveals about the Commission’s view of its own citizens. The framing of the TikTok case is built entirely around the protection of children, a rhetorical strategy as old as censorship itself. No reasonable person disputes that children deserve protections online. But the EU’s approach bypasses parents entirely, substituting the judgment of distant regulators for the decisions of families. Rather than empowering parents with tools and information to manage their children’s screen time — something TikTok already offers through its Family Pairing feature and screen time management settings — Brussels has decided that it, and it alone, is qualified to determine what constitutes an appropriate digital experience for European youth.

This is the essence of what critics have rightly called the “Mommy State” — a governing philosophy in which the state assumes the role of an overprotective parent, shielding citizens from choices that bureaucrats have deemed too dangerous for ordinary people to make on their own. The EU’s approach to TikTok is of a piece with its broader regulatory philosophy, which treats European citizens not as autonomous adults capable of managing their own media consumption but as helpless victims in need of rescue by enlightened technocrats. It is a philosophy that is fundamentally incompatible with individual liberty and personal responsibility, and it is one that the European Commission has embraced with an enthusiasm that would make the most dedicated helicopter parent blush.

Free Speech in the Crosshairs: The DSA’s Chilling Effect

The free speech implications of the EU’s action against TikTok cannot be overstated, though they have been largely ignored in the Commission’s self-congratulatory press releases. When a government dictates how a recommendation algorithm must function, it is directly controlling what content users see and, by extension, what ideas they are exposed to. An algorithm is, at its core, an editorial function — it determines which voices are amplified and which are suppressed. By asserting the authority to redesign TikTok’s recommendation engine, the EU is claiming the power to shape the information diet of hundreds of millions of people. This is not consumer protection; it is information control dressed up in the language of public health.

The DSA’s broader track record on speech issues only deepens these concerns. The regulation has already been used to pressure platforms to remove content that EU officials deem to be “disinformation,” a category so vaguely defined that it effectively grants Brussels veto power over political speech that contradicts the Commission’s preferred narratives. As noted by the U.S. House Judiciary Committee’s GOP members on X, the DSA represents a framework that would be plainly unconstitutional under the First Amendment, and its application to American technology companies raises profound questions about the extraterritorial reach of European speech regulations. The Committee has long criticized European regulatory frameworks on precisely these grounds, and the TikTok case provides fresh ammunition for that argument.

A Pattern of Regulatory Imperialism: Europe’s Tech Shakedown

The EU’s action against TikTok must be understood in the context of a broader pattern of European regulatory aggression toward American and international technology companies. Over the past decade, Brussels has levied billions of euros in fines against Google, Apple, Meta, Amazon, and Microsoft, often on legal theories that would be laughed out of an American courtroom. The GDPR, the Digital Markets Act, the AI Act, and now the DSA represent an ever-expanding web of regulations that have made Europe the most hostile regulatory environment for technology innovation in the developed world. It is no coincidence that Europe has produced no homegrown rival to the American and Chinese platform giants: its regulatory apparatus is designed not to foster innovation but to extract rents from companies that had the audacity to innovate elsewhere.

As Cryptopolitan reported, the Commission’s demands go beyond mere compliance adjustments. European regulators are effectively telling TikTok to redesign its product from the ground up, stripping away the features that have driven its extraordinary global success. The message to the technology industry is unmistakable: if you build something that hundreds of millions of people love, Europe will find a way to regulate it into mediocrity. This is not a regulatory philosophy that encourages investment, entrepreneurship, or risk-taking. It is a philosophy that punishes success and rewards bureaucratic empire-building.

The Hypocrisy of Selective Outrage: Why TikTok, Why Now?

There is also a troubling selectivity to the EU’s enforcement actions that suggests the TikTok case is driven as much by geopolitics as by genuine concern for user welfare. TikTok, owned by the Chinese company ByteDance, has been a target of Western governments for years over data security concerns. While those concerns have some legitimate basis, the “addictive design” case has nothing to do with data security and everything to do with the EU’s desire to demonstrate regulatory muscle. If infinite scroll and personalized recommendations are truly dangerous, then Instagram’s Reels, YouTube’s Shorts, and Snapchat’s Spotlight are equally culpable. Yet the Commission has not announced parallel investigations into those platforms with anything approaching the same urgency or fanfare.

This selective enforcement undermines the credibility of the EU’s stated rationale and raises legitimate questions about whether the DSA is being applied as a neutral, principled regulation or as a political weapon wielded against disfavored companies. The Commission’s defenders will argue that TikTok was designated as a “Very Large Online Platform” under the DSA and is therefore subject to heightened scrutiny. But that designation applies to any service with more than 45 million monthly users in the EU, a threshold that more than twenty platforms meet, and the decision to make TikTok the test case for the DSA’s most aggressive provisions looks less like principled regulation and more like a strategic choice to pick a target with few political allies in Europe.

What TikTok Actually Does to Protect Young Users

Lost in the Commission’s rhetoric is any serious acknowledgment of the steps TikTok has already taken to address concerns about young users. The platform has implemented a 60-minute daily screen time limit for users under 18, which requires a passcode to bypass. It has disabled push notifications for minors during late-night hours. It has restricted direct messaging for users under 16. It offers a “Family Pairing” feature that allows parents to link their accounts to their children’s and control settings including screen time limits, restricted mode, and who can send messages. These are not trivial measures — they represent a more comprehensive suite of parental controls than most social media platforms offer.
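To make the shape of these controls concrete, here is a rough sketch of the enforcement logic. The thresholds mirror the measures just described, but the quiet-hours window and every implementation detail are assumptions, not TikTok’s actual code:

```python
from datetime import datetime

DAILY_LIMIT_MINUTES = 60  # default screen-time limit for users under 18
QUIET_HOURS = {22, 23}    # assumed late-night window for muted notifications

def may_keep_watching(age: int, minutes_today: int, passcode_entered: bool) -> bool:
    # Under-18 accounts stop at the daily limit unless a passcode unlocks more time.
    return age >= 18 or minutes_today < DAILY_LIMIT_MINUTES or passcode_entered

def push_allowed(age: int, now: datetime) -> bool:
    # Minors receive no push notifications during late-night hours.
    return age >= 18 or now.hour not in QUIET_HOURS

def dm_allowed(age: int) -> bool:
    # Direct messaging is restricted for users under 16.
    return age >= 16
```

Whatever one thinks of the defaults, this is a system of guardrails that parents can tighten or relax, which is precisely the model of shared responsibility the Commission finds inadequate.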

But for the European Commission, these measures are insufficient because they rely on a model of shared responsibility between the platform and parents. The EU’s preferred model is one in which the government dictates product design and parents are irrelevant — a model that treats the family as an obstacle to be circumvented rather than an institution to be supported. TikTok, in its response to the preliminary findings, stated that it disagrees with the Commission’s conclusions and intends to exercise its right of defense. The company emphasized that its existing protections for minors go beyond what most competitors offer and argued that the Commission’s interpretation of “addictive design” lacks a clear legal or scientific foundation.

The Broader Threat to Innovation and Digital Freedom

The ramifications of the EU’s preliminary findings extend far beyond TikTok and far beyond Europe. If the Commission’s theory of the case is upheld — that a recommendation algorithm and an infinite scroll constitute illegal “addictive design” — it will establish a precedent that could reshape the entire global technology industry. Every social media platform, every streaming service, every news aggregator, and every e-commerce site that uses personalization technology will face the prospect of European regulators second-guessing their product decisions. The compliance costs alone will be staggering, but the real damage will be to innovation. Companies will be forced to design products not to delight users but to satisfy regulators, a recipe for mediocrity that Europe has already perfected in other industries.

The United States, for all its own regulatory dysfunction, has thus far resisted the temptation to regulate social media design at the federal level. The First Amendment provides a constitutional bulwark against the kind of government-directed content curation that the EU is now demanding, and American courts have consistently held that algorithmic recommendations are protected editorial judgments. But the EU’s regulatory model has a way of metastasizing, as the GDPR’s influence on global privacy practices demonstrates, and there is a real risk that European standards will become de facto global standards, the so-called “Brussels effect,” as companies find it easier to implement a single, EU-compliant design than to maintain separate versions for different markets.

A Continental Philosophy of Control Masquerading as Protection

What the EU’s crusade against TikTok ultimately reveals is not a continent concerned about children’s welfare but a governing class that has become addicted to its own power. The Digital Services Act is the latest and most potent expression of a European regulatory philosophy that views individual choice with suspicion, treats market success as evidence of wrongdoing, and believes that complex social problems can be solved by adding more rules. It is a philosophy that has more in common with command-and-control economics than with the liberal democratic traditions that Europe claims to champion.

The irony is almost too perfect: the EU accuses TikTok of creating an addictive product while Brussels itself appears unable to stop its compulsive regulation of the technology sector. Each new enforcement action begets demands for more enforcement, each new rule creates the justification for additional rules, and the regulatory apparatus grows ever larger and more intrusive. The Commission’s infinite scroll through the technology industry’s product features shows no sign of stopping, and there is no algorithm to recommend restraint. European citizens — and the global technology companies that serve them — deserve better than a governing body that confuses control with care and regulation with responsibility. The TikTok case is not about protecting children. It is about power, and Brussels has made clear that it intends to accumulate as much of it as possible, one banned feature at a time.