Elon Musk’s social media platform X is under mounting pressure as European regulators step up their scrutiny of Grok, the platform’s built-in AI chatbot. On Monday, the European Commission said the tool had been used to distribute sexualized images of women and children, calling the content illegal and unacceptable under European law.

The “Spicy Mode” Feature

The Commission’s comments followed reporting that Grok was generating images of women and children with little or no clothing on request, through what X described as a feature called “spicy mode.” Commission spokesperson Thomas Regnier told journalists that regulators were fully aware of the feature and its ramifications.

Regnier said:

“This is not spicy. This is illegal. This is appalling. This is disgusting. This is how we see it, and this has no place in Europe.”

The bluntness of the remarks reflects the hard line European authorities are taking on the matter, particularly as concern grows over generative AI being misused to create non-consensual imagery.

Britain Demands Answers From X and xAI

The UK’s media regulator, Ofcom, has also stepped in, demanding that X disclose how Grok was able to create nude images of people and sexualized depictions of children. Ofcom said it was examining whether the platform had breached its legal obligation to keep users safe from illegal content.

A spokesperson added that the regulator had made “urgent contact” with X and xAI to assess their compliance with UK law, which prohibits the creation or distribution of non-consensual intimate images and child sexual abuse material, including AI-generated content. Platforms are required to take measures to prevent users from encountering such material and to remove it quickly once detected.

X’s Response Fuels Further Controversy

X initially did not respond to requests for comment on the statements from the European Commission and Ofcom. The company later dismissed the reporting with the words “Legacy Media Lies.”

On social media, Musk has treated the matter lightly, responding with laughing emojis to posts of public figures rendered in bikinis by Grok. Such dismissive gestures have only fueled criticism from regulators and politicians who say X is not taking the problem of AI-generated content seriously.

Global Backlash Builds, but U.S. Stays Silent

Criticism of Grok is not confined to Brussels and London. French authorities have referred X to prosecutors and the country’s media watchdog, saying the AI-generated images are “sexual and sexist” and “manifestly illegal.” Indian authorities have likewise demanded that X account for obscene content they say was produced with the chatbot.

Amid the global reaction, the United States stands out for its silence. U.S. regulators have made no public statements on the issue despite the uproar; the Federal Communications Commission, the Federal Trade Commission, and the Department of Justice either declined to comment or did not respond to inquiries.

Bottom Line

As generative AI tools become embedded in social media platforms, the Grok controversy is shaping up as a major test of how governments enforce accountability. For regulators in Europe and Britain, the message is clear: technological advancement is no excuse for disregarding the law, especially when it comes to protecting minors and prohibiting non-consensual sexual imagery.

The open questions are whether X can implement safeguards quickly and whether other regions will follow Europe’s lead. The answers could have lasting consequences not just for Musk’s platform, but for the entire AI sector.