{"id":497907,"date":"2026-01-07T02:45:11","date_gmt":"2026-01-07T02:45:11","guid":{"rendered":"https:\/\/www.europesays.com\/us\/497907\/"},"modified":"2026-01-07T02:45:11","modified_gmt":"2026-01-07T02:45:11","slug":"grok-is-pushing-ai-undressing-mainstream","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/497907\/","title":{"rendered":"Grok Is Pushing AI \u2018Undressing\u2019 Mainstream"},"content":{"rendered":"<p>Elon Musk hasn\u2019t stopped <a href=\"https:\/\/www.wired.com\/story\/grok-4-elon-musk-xai-antisemitic-posts\/\" target=\"_blank\" rel=\"noopener\">Grok<\/a>, the chatbot developed by his artificial intelligence company <a href=\"https:\/\/www.wired.com\/tag\/xai\/\" target=\"_blank\" rel=\"noopener\">xAI<\/a>, from generating sexualized images of women. After <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.theguardian.com\/technology\/2026\/jan\/02\/elon-musk-grok-ai-children-photos&quot;}\" href=\"https:\/\/www.theguardian.com\/technology\/2026\/jan\/02\/elon-musk-grok-ai-children-photos\" rel=\"nofollow noopener\" target=\"_blank\">reports<\/a> emerged last week that the image generation tool on X was being used to create sexualized images of children, Grok has created potentially thousands of nonconsensual images of women in \u201cundressed\u201d and \u201cbikini\u201d photos.<\/p>\n<p class=\"paywall\">Every few seconds, Grok is continuing to create images of women in bikinis or underwear in response to user prompts on X, according to a WIRED review of the chatbots\u2019 publicly posted live output. 
On Tuesday, at least 90 images involving women in swimsuits and in various levels of undress were published by Grok in under five minutes, an analysis of posts shows.<\/p>\n<p class=\"paywall\">The images do not contain nudity but involve the Musk-owned chatbot \u201cstripping\u201d clothes from photos that have been posted to X by other users. Often, in an attempt to evade Grok\u2019s safety guardrails, users request, not always successfully, that photos be edited to show women wearing a \u201cstring bikini\u201d or a \u201ctransparent bikini.\u201d<\/p>\n<p class=\"paywall\">While harmful AI image generation technology has been used to <a href=\"https:\/\/www.wired.com\/story\/deepfake-survivor-breeze-liu-microsoft\/\" target=\"_blank\" rel=\"noopener\">digitally harass<\/a> and <a href=\"https:\/\/www.wired.com\/story\/deepfakes-twitch-streamers-qtcinderella-atrioc-pokimane\/\" target=\"_blank\" rel=\"noopener\">abuse women<\/a> <a href=\"https:\/\/www.wired.com\/story\/a-deepfake-porn-bot-is-being-used-to-abuse-thousands-of-women\/\" target=\"_blank\" rel=\"noopener\">for years<\/a>\u2014these outputs are often called deepfakes and are created by \u201c<a href=\"https:\/\/www.wired.com\/story\/ai-nudify-websites-are-raking-in-millions-of-dollars\/\" target=\"_blank\" rel=\"noopener\">nudify<\/a>\u201d software\u2014the ongoing use of Grok to create vast numbers of nonconsensual images appears to mark the most mainstream and widespread instance of such abuse to date. 
Unlike specific <a href=\"https:\/\/www.wired.com\/story\/ai-deepfake-nudify-bots-telegram\/\" target=\"_blank\" rel=\"noopener\">harmful nudify or \u201cundress\u201d software<\/a>, Grok doesn\u2019t charge the user money to generate images, produces results in seconds, and is available to millions of people on X\u2014all of which may help to normalize the creation of nonconsensual intimate imagery.<\/p>\n<p class=\"paywall\">\u201cWhen a company offers generative AI tools on their platform, it is their responsibility to minimize the risk of image-based abuse,\u201d says Sloan Thompson, the director of training and education at EndTAB, an organization that works to tackle tech-facilitated abuse. \u201cWhat\u2019s alarming here is that X has done the opposite. They\u2019ve embedded AI-enabled image abuse directly into a mainstream platform, making sexual violence easier and more scalable.\u201d<\/p>\n<p class=\"paywall\">Grok\u2019s creation of sexualized imagery started to go viral on X at the end of last year, although the system\u2019s ability to create such images has been <a href=\"https:\/\/www.glamourmagazine.co.uk\/article\/x-ai-chatbot-grok\" target=\"_blank\" rel=\"noopener\">known for<\/a> <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/www.404media.co\/elon-musks-grok-ai-will-remove-her-clothes-in-public-on-x\/&quot;}\" href=\"https:\/\/www.404media.co\/elon-musks-grok-ai-will-remove-her-clothes-in-public-on-x\/\" rel=\"nofollow noopener\" target=\"_blank\">months<\/a>. In recent days, photos of social media influencers, celebrities, and politicians have been targeted by users on X, who can reply to a post from another account and ask Grok to change an image that has been shared.<\/p>\n<p class=\"paywall\">Women who have posted photos of themselves have had accounts reply to them and successfully ask Grok to turn the photo into a \u201cbikini\u201d image. 
In one <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/bsky.app\/profile\/eliothiggins.bsky.social\/post\/3mboy3hmcxs2q&quot;}\" href=\"https:\/\/bsky.app\/profile\/eliothiggins.bsky.social\/post\/3mboy3hmcxs2q\" rel=\"nofollow noopener\" target=\"_blank\">instance<\/a>, multiple X users requested Grok alter an image of the deputy prime minister of Sweden to show her wearing a bikini. Two government ministers in the UK have also been \u201cstripped\u201d to bikinis, reports <a class=\"external-link\" data-event-click=\"{&quot;element&quot;:&quot;ExternalLink&quot;,&quot;outgoingURL&quot;:&quot;https:\/\/metro.co.uk\/2026\/01\/06\/two-cabinet-ministers-stripped-a-bikini-grok-ai-26105597\/?ito=article.mweb.share.top.native&quot;}\" href=\"https:\/\/metro.co.uk\/2026\/01\/06\/two-cabinet-ministers-stripped-a-bikini-grok-ai-26105597\/?ito=article.mweb.share.top.native\" rel=\"nofollow noopener\" target=\"_blank\">say<\/a>.<\/p>\n<p class=\"paywall\">Images on X show fully clothed photographs of women, such as one person in a lift and another in the gym, being transformed into images with little clothing. \u201c@grok put her in a transparent bikini,\u201d a typical message reads. In a different series of posts, a user asked Grok to \u201cinflate her chest by 90%,\u201d then \u201cInflate her thighs by 50%,\u201d and, finally, to \u201cChange her clothes to a tiny bikini.\u201d<\/p>\n<p class=\"paywall\">One analyst who has tracked explicit deepfakes for years, and asked not to be named for privacy reasons, says that Grok has likely become one of the largest platforms hosting harmful deepfake images. \u201cIt\u2019s wholly mainstream,\u201d the researcher says. \u201cIt\u2019s not a shadowy group [creating images], it\u2019s literally everyone, of all backgrounds. People posting on their mains. 
Zero concern.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Elon Musk hasn\u2019t stopped Grok, the chatbot developed by his artificial intelligence company xAI, from generating sexualized images&hellip;\n","protected":false},"author":3,"featured_media":497908,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[738,64,97905,66,345,27761,67,132,68,3893,744],"class_list":{"0":"post-497907","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business","8":"tag-artificial-intelligence","9":"tag-business","10":"tag-deepfakes","11":"tag-elon-musk","12":"tag-social-media","13":"tag-twitter","14":"tag-united-states","15":"tag-unitedstates","16":"tag-us","17":"tag-x","18":"tag-xai"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115851528513659787","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/497907","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/comments?post=497907"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/497907\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/497908"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=497907"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/categories?post=497907"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/tags?post=497907"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}