{"id":6557,"date":"2026-04-17T05:46:27","date_gmt":"2026-04-17T05:46:27","guid":{"rendered":"https:\/\/www.europesays.com\/ai\/6557\/"},"modified":"2026-04-17T05:46:27","modified_gmt":"2026-04-17T05:46:27","slug":"how-to-spot-ai-generated-clips-of-crying-us-soldiers-on-social-media","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ai\/6557\/","title":{"rendered":"How to spot AI-generated clips of crying US soldiers on social media"},"content":{"rendered":"<p>&#8220;Mom, Dad, checking in,&#8221; one person who appears to be a U.S. service member tells the camera. &#8220;I\u2019m good, okay? No need to worry about me.&#8221;<\/p>\n<p>Surrounded by snow and ice, she <a href=\"https:\/\/mvau.lt\/media\/ea9993b1-4516-4f36-97f9-8f9584f10060\" target=\"_blank\" rel=\"nofollow noopener\">continues<\/a>: &#8220;It\u2019s freezing out here and I\u2019m soaked. But for the safety of the American people, and to help make America great again, I\u2019m standing my ground. Do I earn your follow and your thumbs up yet?
Stay safe.&#8221;<\/p>\n<p>That isn\u2019t a U.S. service member or even a real person. The video was generated with artificial intelligence. <\/p>\n<p>It\u2019s one of many fake videos showing similar scenarios \u2014 U.S. service members crying or in dire conditions. Some show female soldiers in barren locations, in uniform, sniffling or crying. They address their parents directly. Others show male soldiers tearfully addressing their partners.<\/p>\n<p>Fake videos of service members weeping and seeking empathy from viewers have surfaced during other conflicts, including the <a href=\"https:\/\/www.france24.com\/en\/europe\/20251113-videos-ukrainian-soldiers-refusing-fight-deepfakes-russian-streamers\" target=\"_blank\" rel=\"nofollow noopener\">Russia-Ukraine war<\/a>, and have continued during the Iran war. <a href=\"https:\/\/www.reuters.com\/world\/middle-east\/how-many-people-have-been-killed-us-israel-war-iran-2026-04-07\/\" target=\"_blank\" rel=\"nofollow noopener\">Thirteen U.S. service members<\/a> have been killed as of April 15. <\/p>\n<p>Creators have a financial incentive to produce emotional content. Viral videos can earn money for the creators through social media platforms\u2019 programs. Or their videos can direct users to websites prompting them to make purchases, or steal their personal information. <\/p>\n<p>PolitiFact found at least 11 TikTok, Facebook and YouTube accounts that primarily post AI-generated videos of service members, with more than 174,000 followers combined. The fake military videos gained 29.6 million views collectively, with average views per account ranging from 628 to 466,192. Some video labels disclose that they are AI-generated, but even in those cases, people commenting on the videos don\u2019t seem to realize they\u2019re fake.<\/p>\n<p>PolitiFact contacted Meta, YouTube and TikTok about the accounts. 
The accounts we inquired about became unavailable as of April 15.<\/p>\n<p>A TikTok spokesperson said the platform\u2019s <a href=\"https:\/\/www.tiktok.com\/safety\/en\/policies-and-engagement\/integrity-authenticity\" target=\"_blank\" rel=\"nofollow noopener\">Community Guidelines<\/a> prohibit AI-generated content that presents misleading information on matters of public importance, such as an active conflict, and that they have removed the accounts we shared.<\/p>\n<p>Facebook removed the accounts we flagged for violating its policies, a spokesperson said, adding that the pages were not monetized. <\/p>\n<p>YouTube also removed a channel we inquired about, a spokesperson said, for violating its spam policies.<\/p>\n<p>Shannon Razsadin, chief executive officer of the nonprofit organization Military Family Advisory Network, said military families are encountering such videos and questioning what is real. <\/p>\n<p>&#8220;These videos heighten anxiety by presenting scenarios that may not reflect reality, which can compound fear for families already navigating a lot of unknowns,&#8221; she said.<\/p>\n<p>Mary Bennett Doty, associate director of programs at We the Veterans &amp; Military Families, said such content adds to inflammatory rhetoric and could deepen division.<\/p>\n<p>Videos show emotional service members talking about their families, fallen soldiers<\/p>\n<p>The accounts often use one type of background and script for their videos, typically sticking to videos of only men or only women, or pivoting from one to the other. 
<\/p>\n<p>\t\t<img decoding=\"async\" class=\" aspect-ratio-original lazyload lazyload-in-view\" alt=\"Screenshots from TikTok and YouTube of accounts posting AI-generated videos of soldiers crying.\" title=\"Ai soldiers\" src=\"https:\/\/www.europesays.com\/ai\/wp-content\/uploads\/2026\/04\/1a169c28-6e8e-423a-b851-5f2d0c4c2ff9.png\"\/><\/p>\n<p>\n\t\tPolitiFact\t<\/p>\n<p>\t\t\t\t\t\t\t\t\t\tScreenshots from TikTok and YouTube of accounts posting AI-generated videos of soldiers crying.<\/p>\n<p>One page named &#8220;US Soldier Legacy,&#8221; for example, contained videos of women crying and talking over the sound of jets, with smoke in the background.<\/p>\n<p>In one video posted by a TikTok account named &#8220;Usa Soldier Life,&#8221; with more than 764,000 views, a man stands in the foreground with a flag-draped coffin in the distance, saying through tears, &#8220;I\u2019m gonna miss you brother, I hope, I hope you know how much we love you. I love you, man. Rest easy.&#8221;<\/p>\n<p><a href=\"https:\/\/archive.ph\/jlhXC\" target=\"_blank\" rel=\"nofollow noopener\">Other<\/a> <a href=\"https:\/\/archive.ph\/tkHcz\" target=\"_blank\" rel=\"nofollow noopener\">pages<\/a> primarily show male service members, often holding photos presumably of their loved ones and addressing their partners. One page\u2019s captions say the videos are their last messages to their families. 
<\/p>\n<p>Accounts often seek to monetize content <\/p>\n<p>Many accounts don\u2019t appear to seek money, but some provide a way for viewers to potentially contact the account holders, such as through a telephone number or a website, for reasons that could include selling items or leading people to a phishing scam.<\/p>\n<p>These accounts follow a trend that uses AI to create synthetic &#8220;influencers&#8221; and other deepfake content related to politics.<\/p>\n<p>For example, <a href=\"https:\/\/www.fastcompany.com\/91507096\/jessica-foster-popular-maga-influencer-ai-model\" target=\"_blank\" rel=\"nofollow noopener\">one<\/a> <a href=\"https:\/\/www.washingtonpost.com\/technology\/2026\/03\/20\/jessica-foster-maga-dream-girl-ai-fake\/\" target=\"_blank\" rel=\"nofollow noopener\">profile<\/a> of a female service member named &#8220;Jessica Foster&#8221; that gained 1 million followers on Instagram while posting images of her with President Donald Trump and other political figures was AI-generated. The account linked to a separate page where the profile sold exclusive fetish content.<\/p>\n<p>Accounts like these can make money through viewers\u2019 engagement with the content, or can direct users to other websites that sell products. Daniel Schiff, a Purdue University assistant professor of technology policy, said people risk exposure to cyberattacks and information theft.<\/p>\n<p>&#8220;Accounts may post sympathetic or incendiary information to leverage people&#8217;s emotions or draw their attention,&#8221; he said. &#8220;Once that account has enough followers, they may post links to external content, which could range from selling clothing to selling intimate content.&#8221; Schiff said many of these accounts are driven by economic motives. 
<\/p>\n<p>In <a href=\"https:\/\/www.tiktok.com\/@us_militarry\/video\/7626471552472714509\" target=\"_blank\" rel=\"nofollow noopener\">one video<\/a>, the AI-generated character cries and says he\u2019s thinking about home, then promotes a &#8220;shop link&#8221; in the account\u2019s bio description as he continues crying. The bio did not feature a link. One <a href=\"https:\/\/archive.ph\/jlhXC\" target=\"_blank\" rel=\"nofollow noopener\">Facebook page<\/a> with 31,000 followers called &#8220;Brave Marine,&#8221; featuring similar videos of male soldiers, linked to a website featuring job listings for a maritime company. A telephone number listed for the website\u2019s registration has previously been connected to <a href=\"https:\/\/www.sygnia.co\/blog\/inside-recovery-scam-network-legal-impersonation\/\" target=\"_blank\" rel=\"nofollow noopener\">fraud campaigns<\/a>.<\/p>\n<p>The content undermines trust in information sources that military families rely on, Razsadin said.<\/p>\n<p>&#8220;Many official entities like military branches, helping agencies or military service organizations like ourselves also use social media to communicate verified content to military families,&#8221; she said. <\/p>\n<p>How to identify fake videos of service members<\/p>\n<p>If you have doubts about a video\u2019s authenticity, check the account that posted it. If it consistently posts videos with different people saying the same things, it\u2019s one indicator the videos could be AI-generated.<\/p>\n<p>The profile\u2019s creation date and posting volume also can be a signal. Some accounts we saw were created around the time the Iran war began, and have been posting consistently since.<\/p>\n<p>&#8220;Many of these accounts are relatively new and engage in fairly uniform patterns of influence-style posting,&#8221; Schiff said. <\/p>\n<p>Some dubious accounts primarily post attractive young women in uniform. 
Gregory Daddis, a Texas A&amp;M University history professor who served in the U.S. Army for 26 years, said that even when the women in the videos have muddy or scratched faces, they are still portrayed as attractive.<\/p>\n<p>&#8220;Nearly perfectly waxed eyebrows across the board seems telling to me,&#8221; he said.<\/p>\n<p>The uniforms also can be a giveaway. In <a href=\"https:\/\/mvau.lt\/media\/a29f4a06-e715-437c-b90b-a2e5361ad072\" target=\"_blank\" rel=\"nofollow noopener\">one April 12 video<\/a>, a female service member said, &#8220;Dad, it\u2019s almost Christmas. I miss you so much, but for the safety of the American people, I have to hold the line out here. Could you tap the little red plus on my profile to support me? Um, I love you both. Stay safe.&#8221;<\/p>\n<p>A closer look at her uniform shows that her name is gibberish and the &#8220;U.S.&#8221; has three periods. <\/p>\n<p>\t\t<img decoding=\"async\" class=\" aspect-ratio-original lazyload lazyload-in-view\" alt=\"Screenshots of AI-generated videos of soldiers crying, with the imperfections circled to show it is fake.\" title=\"AI soldiers misspellings\" src=\"https:\/\/www.europesays.com\/ai\/wp-content\/uploads\/2026\/04\/2f9baca5-af32-4f09-89b2-7a51c0d19f3e.png\"\/><\/p>\n<p>\n\t\tPolitiFact\t<\/p>\n<p>\t\t\t\t\t\t\t\t\t\tScreenshots of AI-generated videos of soldiers crying, with the imperfections circled to show it is fake.<\/p>\n<p>Illegible text and spelling or grammar mistakes are common in these videos. Daddis said the rank insignia are out of place in several AI-generated videos or feature inaccurate <a href=\"https:\/\/www.war.gov\/Resources\/Insignias\/\" target=\"_blank\" rel=\"nofollow noopener\">symbols<\/a>. On <a href=\"https:\/\/www.army.mil\/uniforms\/#lbclosed\" target=\"_blank\" rel=\"nofollow noopener\">combat uniforms<\/a>, those are located in a patch on the middle of the chest, but some videos show them to the side or missing. 
<\/p>\n<p>Some videos still have a watermark indicating they were made with AI. One example is Veo, Google\u2019s AI video creator. Watermarks can be cropped out, but another tell that a video was AI-generated is its length: Veo, for one, can typically make videos only up to <a href=\"https:\/\/gemini.google\/overview\/video-generation\/\" target=\"_blank\" rel=\"nofollow noopener\">eight seconds long<\/a>. <\/p>\n<p>\t\t<img decoding=\"async\" class=\" aspect-ratio-original lazyload lazyload-in-view\" alt=\"Screenshot from Facebook of an AI-generated video of a soldier crying.\" title=\"AI soldiers\" src=\"https:\/\/www.europesays.com\/ai\/wp-content\/uploads\/2026\/04\/b86e79b6-1f51-4bef-8749-3914a0206f51.png\"\/><\/p>\n<p>\n\t\tPolitiFact\t<\/p>\n<p>\t\t\t\t\t\t\t\t\t\tScreenshot from Facebook of an AI-generated video of a soldier crying.<\/p>\n<p>&#8220;Be cautious of content that relies heavily on emotion but lacks specifics,&#8221; Razsadin said.<\/p>\n<p>Staff Writer Maria Brice\u00f1o contributed to this report.<\/p>\n","protected":false},"excerpt":{"rendered":"&#8220;Mom, Dad, checking in,&#8221; one person who appears to be a U.S. service member tells the camera. 
&#8220;I\u2019m&hellip;\n","protected":false},"author":2,"featured_media":6558,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[5998,24,5985,6001,25,5994,2657,5988,1148,5996,6003,5995,1605,6005,5993,2246,5987,6000,5991,4731,6006,5990,6004,5989,5992,5997,6007,756,5999,5986,6002,2454,4555,1928],"class_list":{"0":"post-6557","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-ai","8":"tag-account","9":"tag-ai","10":"tag-ai-generated-videos","11":"tag-american-people","12":"tag-artificial-intelligence","13":"tag-daniel-schiff","14":"tag-deepfakes","15":"tag-emotional-content","16":"tag-facebook","17":"tag-fake-videos","18":"tag-follower","19":"tag-gregory-daddis","20":"tag-iran-war","21":"tag-many-fake-video","22":"tag-mary-bennett-doty","23":"tag-mcnd","24":"tag-military-families","25":"tag-military-family","26":"tag-military-uniforms","27":"tag-misinformation","28":"tag-page","29":"tag-phishing","30":"tag-profile","31":"tag-scam","32":"tag-shannon-razsadin","33":"tag-social-media-monetization","34":"tag-social-medium","35":"tag-tiktok","36":"tag-u-s-service-member","37":"tag-u-s-service-members","38":"tag-uniform","39":"tag-video","40":"tag-website","41":"tag-youtube"},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/6557","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/comments?post=6557"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/6557\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:
\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media\/6558"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media?parent=6557"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/categories?post=6557"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/tags?post=6557"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}