The European Union (EU) has fined Elon Musk’s X (formerly Twitter) $140 million for violating one of the bloc’s key social media laws. The case is being seen as a key test of European officials’ ability to take on American tech giants, even though such actions could irk the Trump administration.

The EU said that it issued the fine to X for breaching its transparency obligations under the Digital Services Act (DSA), the bloc’s main social media and e-commerce law. The breaches include the deceptive design of its ‘blue checkmark’, the lack of transparency of its advertising repository, and the failure to provide access to public data for researchers. This is the first non-compliance decision under the DSA.

The fine has drawn a political storm in the United States. Ahead of the decision, US Vice President JD Vance said on X: “Rumors swirling that the EU commission will fine X hundreds of millions of dollars for not engaging in censorship. The EU should be supporting free speech, not attacking American companies over garbage.” Musk has called for the European Union to be “abolished”.

US Secretary of State Marco Rubio said the fine was an “attack on all American tech platforms and the American people by foreign governments”. Brendan Carr, the Chairman of the US Federal Communications Commission, said that X was fined merely for being a successful US tech company.

What X’s violations were

The EU said that X’s use of the blue checkmark for verified accounts “deceives” users since anyone can pay to obtain the ‘verified’ status without the company meaningfully verifying who is behind the account. This makes it difficult for users to judge the authenticity of the accounts and the content they engage with. It added that the practice violates the DSA’s prohibition on deceptive design practices on online platforms’ services.

“This deception exposes users to scams, including impersonation frauds, as well as other forms of manipulation by malicious actors. While the DSA does not mandate user verification, it clearly prohibits online platforms from falsely claiming that users have been verified, when no such verification took place,” the EU said.

The bloc further said that X’s advertisement repository fails to meet the transparency and accessibility requirements of the DSA. The platform also failed to meet its obligation to provide researchers with access to its public data. For instance, X’s terms of service prohibit eligible researchers from independently accessing its public data, including through scraping, the EU said.


“Moreover, X’s processes for researchers’ access to public data impose unnecessary barriers, effectively undermining research into several systemic risks in the European Union,” it added.

Key features of the Digital Services Act

The DSA came into effect in 2023, with the aim of “regulating the obligations of digital services, including marketplaces, that act as intermediaries in their role of connecting consumers with goods, services, and content.” Its features include:

* Faster removals, opportunity to challenge: Social media companies are required to add “new procedures for faster removal” of content deemed illegal or harmful. They must explain to users how their content takedown policy works. Users can challenge takedown decisions, and seek out-of-court settlements.

* Bigger platforms have greater responsibility: The legislation has junked the one-size-fits-all approach and put a greater burden of accountability on the big tech companies. Under the DSA, ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs), that is, platforms with more than 45 million users in the EU, have more stringent requirements.


* Direct supervision by the European Commission: These requirements and their enforcement will be centrally supervised by the European Commission itself, ensuring that companies are not able to sidestep the legislation at the member-state level.

* More transparency on how algorithms work: VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analysis and mitigation to drive accountability for the societal impacts of their products. VLOPs must allow regulators and researchers to access their data to assess compliance and identify systemic risks of illegal or harmful content.

* Clearer identifiers for ads and who’s paying for them: Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the ads. They must not display personalised ads directed towards minors or based on sensitive personal data.