The nadir of a very low week came when a user of Elon Musk’s Grok AI instructed it to take an image of Renée Nicole Good – slumped at the steering wheel of her car, apparently covered in her own blood, moments after being shot dead by a federal immigration agent – and place her in a bikini. The chatbot complied. When the user expressed disgust, Grok, apparently misreading the sentiment, responded cheerfully: “Glad you approve! What other wardrobe malfunctions can I fix for you?”

This obscenity did not emerge from nowhere. It was the grotesque culmination of a week in which the killing of a 37-year-old woman on a Minneapolis street became a laboratory for every pathology afflicting the information ecosystem. Within hours of Good’s death, competing narratives had calcified.

Senior officials in the Trump administration described her as a “violent rioter” who had “weaponised her vehicle” in an “act of domestic terrorism”. The city’s mayor accused the officials of lying. Each side pointed to the same videos as proof of their version of events. The BBC, the New York Times and others attempted to establish what the multiple clips of the incident, filmed from different perspectives, actually revealed about how it had unfolded. Their efforts were scorned by ideological adversaries, including US vice-president JD Vance.

Then came the attempts to identify the masked shooter. Users instructed Grok to “unmask” the agent from footage in which his face was covered. The AI obligingly generated what it presented as a revealed face. This fabrication was paired with a name – Steve Grove – of unclear provenance, and the resulting package was disseminated as fact. By Thursday morning, a gun shop owner in Missouri and the publisher of the Minnesota Star Tribune were fielding threats and abuse. Neither had any connection to the shooting. The actual agent has since been identified by journalists using traditional reporting methods.

We have arrived at a moment that media theorists have long predicted but which nonetheless feels vertiginous now that it is here. For most of its history, the photograph has been understood as bearing a special relationship to truth. Unlike a painting or a written account, the camera was thought to capture reality mechanically, without human intervention. “The camera never lies” became a cliché precisely because it expressed a widely held assumption about the medium’s essential nature.

This faith was always partly misguided. Photographs have been cropped, retouched, staged and manipulated since the medium’s earliest days. The framing of a shot, the moment chosen, the context withheld – all these editorial decisions have always shaped what we see and understand. What the camera captured was never simply “reality” but a particular slice of it, selected and presented by human hands and minds.

Yet there remained a residual evidentiary weight to photographic and video evidence. However manipulated the presentation, there was something in front of the lens when the shutter clicked. The Minneapolis footage exists because something happened on that street. Even when we argue about what the video shows, we are arguing about interpretations of the same physical event.

Generative AI introduces something categorically new and different. Grok did not reveal the shooter’s face; it invented one. The bikini image was not edited from reality; it was conjured from nothing. We have moved from a world where images could be selectively presented to one where they can be wholly fabricated, with no underlying reality to which they correspond.

The Minneapolis videos will continue to be analysed in investigations and courts. Over time, some interpretations may be confirmed while others are disproved. But the broader lesson is already clear. We have entered a period in which photographic images can no longer claim even a residual presumption of truth. They are fragments, prompts, starting points for competing realities.

Ireland’s Minister for Media, Patrick O’Donovan, responded to last week’s Grok controversy by deleting his X account, telling Limerick’s Live 95 that he did not feel comfortable remaining on a platform where such images were permitted. This is understandable as a personal choice but is not a substitute for a coherent policy response.

The problem is not that individual users are making bad choices on neutral platforms. Grok is not a passive tool that unfortunate souls happen to misuse. It is a product designed with certain capabilities and certain guardrails – or, more pertinently, without them.

For too long, technology companies have been permitted to present themselves as mere conduits. The central fallacy in much political thinking about platforms and AI remains the insistence on neutrality. The tools are neutral, we are told; responsibility lies with users. That position was always convenient and always wrong. In the era of generative AI it has become actively dangerous. These systems do not merely transmit; they create. Their outputs reflect decisions about what should be possible and what should be prevented. Design choices, the selection of training data and monetisation models are all decisions, and they have consequences.

Grok itself acknowledged that the images of Good may violate US legislation criminalising non-consensual intimate images, including AI deepfakes. That the system could generate such content while simultaneously flagging its potential illegality tells you everything about the gap between what these companies know and what they choose to do.

The events of the past week should concentrate minds. Renée Nicole Good was killed on Wednesday morning in Minneapolis. Within hours that reality had become raw material for political narratives, AI fabrications and grotesque violations of her dignity. It is clear that the regulatory frameworks that are supposed to respond to events like these are not fit for purpose. Whether the political will exists to address this is another question entirely.