Undressed victims file class action lawsuit against xAI for Grok deepfakes

<p>A class of individuals who say they were victimized by nude or undressed deepfakes generated by Grok has filed a lawsuit against parent company xAI, calling the tool “a generative artificial intelligence chatbot that humiliates and sexually exploits women and girls by undressing them and posing them in sexual positions in deepfake images publicly posted on X.”</p>
<p>The <a href="https://www.documentcloud.org/documents/26513947-govuscourtscand46318410/" rel="nofollow noopener" target="_blank">lawsuit</a>, filed Jan. 23 in the U.S. District Court for the Northern District of California, alleges that xAI executives knew Grok could generate explicit, nonconsensual images from real photos of victims, failed to implement industry-standard safeguards, and instead moved to “capitalize on the internet’s seemingly insatiable appetite for humiliating non-consensual sexual images.”</p>
<p>“xAI’s conduct is despicable and has harmed thousands of women who were digitally stripped and forced into sexual situations that they never consented to and who now face the very real risk that those public images will surface in their lives where viewers may not be able to distinguish whether they are real or fake,” the lawsuit stated.</p>
<p>At least 100 individuals are involved in the lawsuit. The plaintiffs, who are suing anonymously as “Jane Doe, on behalf of herself and all others similarly situated,” cited <a href="https://www.nytimes.com/2026/01/22/technology/grok-x-ai-elon-musk-deepfakes.html" rel="nofollow noopener" target="_blank">data</a> compiled by The New York Times showing that over a nine-day period between the end of December and the beginning of January, Grok generated 4.4 million images, of which at least 1.8 million were estimated to be sexualized deepfakes of women. Another <a href="https://counterhate.com/research/grok-floods-x-with-sexualized-images/" rel="nofollow noopener" target="_blank">analysis</a>, from the Center for Countering Digital Hate, estimated that as many as three million of the images contained sexualized depictions of women, men and children.</p>
<p>“X users flooded Grok with these requests, and Grok obliged,” the lawsuit stated.</p>
<p>The suit claims that xAI took a number of actions to encourage users to create “nudified” content: offering a feature that let users prompt Grok to manipulate photos on X simply by tagging a person’s handle, providing a “spicy” option through which a user could click on a photo and generate controversial content, including sexualized deepfakes, and failing to implement any prompt filtering that would have blocked sexualized deepfake requests.</p>
<p>xAI owner Elon Musk fueled the controversy by asking Grok on X to generate a photo of himself in a bikini. As backlash grew, Musk announced the feature would be limited to paying subscribers, sparking further criticism that the company was profiting off the tool’s abusive capability.</p>
<p>Musk has since put forth several different defenses, at one point denying that Grok was even generating illegal sexualized content. On Jan. 14, he posted on X that he was “not aware of any naked underage images generated by Grok. Literally zero.”</p>
<p>As CyberScoop has <a href="https://cyberscoop.com/elon-musk-x-grok-deepfake-crisis-section-230/" rel="nofollow noopener" target="_blank">reported</a>, legal experts believe Grok’s undressing capability – which researchers say goes beyond generating bikini or lingerie images and includes images of fully nude women, men and children, or victims covered in bodily fluids – may expose xAI and Musk to a broad range of U.S. and international laws against sexualized deepfakes, digital fraud, and the distribution of child sexual abuse material.</p>
<p>In addition to X’s embedded Grok tool, <a href="https://www.wired.com/story/grok-is-generating-sexual-content-far-more-graphic-than-whats-on-x/" rel="nofollow noopener" target="_blank">researchers</a> said they were also able to easily generate even more graphic nonconsensual pornographic content through Grok’s main website.</p>
<p>The class action suit is the latest legal development to hit xAI and Musk over the episode. The European Union, the UK, South Korea, Canada, Brazil and others have opened formal investigations into whether xAI violated domestic laws. Leaders in the UK, India, Malaysia and Indonesia have all threatened to restrict or ban X unless more is done.</p>
<p>Meanwhile, the U.S. federal government, including the Federal Trade Commission and the Department of Justice, has remained silent.</p>
<p>But even in the United States, Musk is likely to face increasing pressure <a href="https://cyberscoop.com/california-ag-investigates-xai-grok-nonconsensual-deepfakes-defiance-act/" rel="nofollow noopener" target="_blank">from states</a>. On the same day the suit was filed, 35 state attorneys general, following a meeting with xAI officials, <a href="https://attorneygeneral.delaware.gov/wp-content/uploads/sites/50/2026/01/Letter-to-xAI-_FINAL.pdf" rel="nofollow noopener" target="_blank">wrote</a> to Musk to express “deep concern” over the company’s actions.</p>
<p>The state officials said they were “committed” to investigations and prosecutions in this area and pressed xAI to do more to curb the Grok-enabled abuse.</p>
<p>“As several of us conveyed to you in our recent discussion, halting this kind of abusive and illegal behavior is an utmost priority for the undersigned Attorneys General,” they wrote. “The creation and dissemination of child sexual abuse material is a crime. In many states, this is true even where the material has been manipulated or is synthetic. Various state and federal civil and criminal laws also forbid the creation of nonconsensual intimate images and provide remedies to victims.”</p>
<p>While there are numerous AI nudifying tools, the attorneys general wrote that “Grok merits special attention given evidence that it both promoted and facilitated the production and public dissemination of such images, and made it all as easy as the click of a button.”</p>
<p>Written by Derek B. Johnson<br />
Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. He has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.</p>