{"id":20523,"date":"2026-04-28T19:26:15","date_gmt":"2026-04-28T19:26:15","guid":{"rendered":"https:\/\/www.europesays.com\/ai\/20523\/"},"modified":"2026-04-28T19:26:15","modified_gmt":"2026-04-28T19:26:15","slug":"600-workers-protest-as-google-signs-200-million-secret-pentagon-ai-warfare-deal","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ai\/20523\/","title":{"rendered":"600+ Workers Protest as Google Signs $200 Million Secret Pentagon AI Warfare Deal"},"content":{"rendered":"<p>As Google on Monday became the latest player in the <a href=\"https:\/\/www.commondreams.org\/tag\/artificial-intelligence\" rel=\"nofollow noopener\" target=\"_blank\">artificial intelligence<\/a> arms race to sign a classified deal with the US Department of Defense, hundreds of <a href=\"https:\/\/www.commondreams.org\/tag\/workers\" rel=\"nofollow noopener\" target=\"_blank\">workers<\/a> at the Silicon Valley giant demanded that its CEO prevent the <a href=\"https:\/\/www.commondreams.org\/tag\/pentagon\" rel=\"nofollow noopener\" target=\"_blank\">Pentagon<\/a> from using the company&#8217;s AI models for covert work.<\/p>\n<p>Reuters <a href=\"https:\/\/www.reuters.com\/technology\/google-signs-classified-ai-deal-with-pentagon-information-reports-2026-04-28\/\" target=\"_blank\" rel=\"nofollow noopener\">reported<\/a> that the $200 million agreement includes safety filters and allows the Pentagon to use Google&#8217;s AI &#8220;for any lawful purpose&#8221; but not for the development of lethal autonomous weapons systems\u2014commonly known as &#8220;killer robots&#8221;\u2014or domestic surveillance without human oversight and control.<\/p>\n<p><a href=\"https:\/\/www.theinformation.com\/articles\/google-signs-classified-ai-deal-pentagon-amid-employee-opposition\" target=\"_blank\" rel=\"nofollow noopener\">According to<\/a> The Information&#8217;s Erin Woo, the deal does not give Google \u201cany right to control or veto lawful government operational 
decision-making.&#8221;<\/p>\n<p>The agreement also reportedly requires Google to adjust its AI safety settings at the government&#8217;s request.<\/p>\n<p>\u201cWe are proud to be part of a broad consortium of leading AI labs and <a href=\"https:\/\/www.commondreams.org\/tag\/technology\" rel=\"nofollow noopener\" target=\"_blank\">technology<\/a> and cloud companies providing AI services and <a href=\"https:\/\/www.commondreams.org\/tag\/infrastructure\" rel=\"nofollow noopener\" target=\"_blank\">infrastructure<\/a> in support of national security,\u201d a Google spokesperson told The Information.<\/p>\n<p>More than 600 Google employees\u2014many of them from the company&#8217;s DeepMind AI laboratory\u2014sent a letter Monday to CEO Sundar Pichai demanding that he block the <a href=\"https:\/\/www.commondreams.org\/tag\/us-military\" rel=\"nofollow noopener\" target=\"_blank\">US military<\/a> from using the firm&#8217;s artificial intelligence technology for classified projects.<\/p>\n<p>\u201cWe want to see AI benefit humanity; not to see it being used in inhumane or extremely harmful ways,&#8221; the letter says, <a href=\"https:\/\/www.washingtonpost.com\/technology\/2026\/04\/27\/google-employees-letter-ai-pentagon\/\" target=\"_blank\" rel=\"nofollow noopener\">according to<\/a> The <a href=\"https:\/\/www.commondreams.org\/tag\/washington-post\" rel=\"nofollow noopener\" target=\"_blank\">Washington Post<\/a>. &#8220;This includes lethal autonomous weapons and mass surveillance but extends beyond.&#8221;<\/p>\n<p>\u201cThe only way to guarantee that Google does not become associated with such harms is to reject any classified workloads,&#8221; the workers stressed. &#8220;Otherwise, such uses may occur without our knowledge or the power to stop them.&#8221;<\/p>\n<p>Thousands of AI experts have called for a pause on the development and deployment of advanced AI technology. 
However, tech companies and military officials have argued\u2014much as the <a href=\"https:\/\/www.commondreams.org\/tag\/military-industrial-complex\" rel=\"nofollow noopener\" target=\"_blank\">military-industrial complex<\/a> did with <a href=\"https:\/\/www.commondreams.org\/tag\/nuclear-weapons\" rel=\"nofollow noopener\" target=\"_blank\">nuclear weapons<\/a> during the Cold War\u2014that if the US does not pursue advanced AI, rivals like China will, leaving the US irrecoverably behind.<\/p>\n<p>As US and allied forces from <a href=\"https:\/\/www.commondreams.org\/news\/us-israel-ai-iran\" target=\"_blank\" rel=\"nofollow noopener\">Israel<\/a> to <a href=\"https:\/\/www.kyivpost.com\/post\/73593\" target=\"_blank\" rel=\"nofollow noopener\">Ukraine<\/a> use AI to make <a href=\"https:\/\/www.commondreams.org\/news\/artificial-intelligence-iran-war\" target=\"_blank\" rel=\"nofollow noopener\">life-and-death wartime decisions<\/a>\u2014including selecting attack targets at a rate unfathomable just a few years ago\u2014use of such technology is <a href=\"https:\/\/www.commondreams.org\/news\/gaza-civilian-casualties-2666414736\" target=\"_blank\" rel=\"nofollow noopener\">expediting<\/a> Israel&#8217;s massacres in <a href=\"https:\/\/www.commondreams.org\/tag\/gaza\" rel=\"nofollow noopener\" target=\"_blank\">Gaza<\/a> and <a href=\"https:\/\/www.commondreams.org\/tag\/lebanon\" rel=\"nofollow noopener\" target=\"_blank\">Lebanon<\/a> and US-Israeli killings in Iran. 
<\/p>\n<p>\u201cHuman lives are already being lost and <a href=\"https:\/\/www.commondreams.org\/tag\/civil-liberties\" rel=\"nofollow noopener\" target=\"_blank\">civil liberties<\/a> put at risk at home and abroad from misuses of the technology we\u2019re playing a key role in building,\u201d the Google workers&#8217; letter states.<\/p>\n<p>The policies and actions of the humans in charge of the US government and military have also stoked fears about their use of AI.<\/p>\n<p>US Defense Secretary Pete Hegseth, for example, has overseen the dismantling of initiatives aimed at reducing wartime harm to civilians\u2014hundreds of thousands of whom have been killed in US-led wars during this century, <a href=\"https:\/\/costsofwar.watson.brown.edu\/costs\/human\" target=\"_blank\" rel=\"nofollow noopener\">according to<\/a> experts. Hegseth has instead promoted &#8220;<a href=\"https:\/\/www.war.gov\/News\/Transcripts\/Transcript\/Article\/4318689\/secretary-of-war-pete-hegseth-addresses-general-and-flag-officers-at-quantico-v\/\" target=\"_blank\" rel=\"nofollow noopener\">maximum lethality<\/a>&#8221; for US forces while <a href=\"https:\/\/www.commondreams.org\/news\/hegseth-rules-of-engagement\" target=\"_blank\" rel=\"nofollow noopener\">expressing disdain<\/a> for what he called &#8220;stupid rules of engagement&#8221; designed to minimize civilian harm. 
<\/p>\n<p>Critics say their concerns have been validated by actions including the US <a href=\"https:\/\/www.commondreams.org\/news\/trump-lying-about-iran-school-strike\" target=\"_blank\" rel=\"nofollow noopener\">cruise missile strike<\/a> on a girls&#8217; school in Iran that killed 168 <a href=\"https:\/\/www.commondreams.org\/tag\/children\" rel=\"nofollow noopener\" target=\"_blank\">children<\/a> and staff and Israeli airstrikes, many of them using US-supplied bombs, that have killed tens of thousands of Palestinian civilians in <a href=\"https:\/\/www.commondreams.org\/tag\/gaza\" rel=\"nofollow noopener\" target=\"_blank\">Gaza<\/a>. <\/p>\n<p>Companies that have run afoul of the <a href=\"https:\/\/www.commondreams.org\/tag\/trump-administration\" rel=\"nofollow noopener\" target=\"_blank\">Trump administration<\/a> for refusing military AI use requests also risk getting left behind.  Anthropic\u2014maker of the AI assistant Claude\u2014lost a $200 million Pentagon contract and is facing a government blacklist and legal battles after the company <a href=\"https:\/\/www.commondreams.org\/news\/hegseth-jawbones-anthropic\" target=\"_blank\" rel=\"nofollow noopener\">refused<\/a> to loosen safety restrictions on autonomous weapons and surveillance.<\/p>\n<p>Meanwhile, OpenAI, which makes the generative AI platform ChatGPT, <a href=\"https:\/\/www.commondreams.org\/news\/openai\" target=\"_blank\" rel=\"nofollow noopener\">rewrote<\/a> its &#8220;no military use&#8221; policy to allow &#8220;national security&#8221; applications of its products, opening the door to lucrative Pentagon contracts.<\/p>\n<p>Not wanting to get left behind as President <a href=\"https:\/\/www.commondreams.org\/tag\/donald-trump\" rel=\"nofollow noopener\" target=\"_blank\">Donald Trump<\/a> returned to office last year, Google quietly <a href=\"https:\/\/www.commondreams.org\/news\/google-ai-products\" target=\"_blank\" rel=\"nofollow noopener\">pulled back<\/a> its commitment to 
not use artificial intelligence for harmful purposes, marking a stark departure from the company&#8217;s long-standing founding motto of &#8220;Don&#8217;t be Evil,&#8221; which it <a href=\"https:\/\/www.commondreams.org\/news\/2018\/05\/21\/evil-fine-now-google-ditches-dont-be-evil-company-code-conduct\" target=\"_blank\" rel=\"nofollow noopener\">ditched<\/a> in 2018.<\/p>\n<p>Pentagon contracts followed, and Google <a href=\"https:\/\/www.tradingkey.com\/analysis\/stocks\/us-stocks\/261830130-google-goog-googl-department-of-war-dod-pentagon-gemini-ai-tradingkey\" target=\"_blank\" rel=\"nofollow noopener\">reportedly<\/a> hopes to add $6 billion in AI deals by next year.<\/p>\n<p><a href=\"https:\/\/www.livescience.com\/technology\/artificial-intelligence\/agi-could-now-arrive-as-early-as-2026-but-not-all-scientists-agree\" target=\"_blank\" rel=\"nofollow noopener\">Most AI experts agree<\/a> that it&#8217;s not a matter of if, but when, artificial intelligence surpasses human capabilities. Experts are increasingly viewing AI as a new emerging species, and prominent industry voices\u2014including philosopher Nick Bostrom, Machine Intelligence Research Institute co-founder Eliezer Yudkowsky, and &#8220;Godfather of AI&#8221; Geoffrey Hinton\u2014have noted that when a more intelligent species&#8217; goals conflict with those of a less intelligent one, the less intelligent species tends to lose, and usually catastrophically.<\/p>\n<p>Hinton is so concerned that he quit Google in 2023 so he could <a href=\"https:\/\/www.commondreams.org\/news\/ai-regulations\" target=\"_blank\" rel=\"nofollow noopener\">speak openly<\/a> about the remote but growing risk of AI one day wiping out humanity. <\/p>\n<p>The perceived probability of existentially catastrophic outcomes from AI\u2014known as p(doom)\u2014was once the stuff of jokes. 
Now, AI experts&#8217; <a href=\"https:\/\/www.linkedin.com\/pulse\/pdoom-most-uncomfortable-metric-tech-right-now-network-outsource-veove\" target=\"_blank\" rel=\"nofollow noopener\">p(doom) predictions<\/a> are watched like weather or market forecasts. Yudkowsky has said there&#8217;s a greater than 95% chance of AI-driven catastrophe. <\/p>\n<p>Hinton\u2014who was <a href=\"https:\/\/www.nobelprize.org\/prizes\/physics\/2024\/press-release\/\" target=\"_blank\" rel=\"nofollow noopener\">awarded<\/a> the 2024 Nobel Prize in Physics for his work on neural networks, the foundational technology behind AI\u2014is relatively more optimistic, putting the odds at 10-20%.<\/p>\n<p>&#8220;There are very few examples of more intelligent things being controlled by less intelligent things,&#8221; he said after winning the Nobel Prize. <\/p>\n","protected":false},"excerpt":{"rendered":"As Google on Monday became the latest player in the artificial intelligence arms race to sign a classified&hellip;\n","protected":false},"author":2,"featured_media":20524,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[24,25,1069,2300,132,1429,14423,141,8310],"class_list":{"0":"post-20523","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-google","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-big-tech","11":"tag-geoffrey-hinton","12":"tag-google","13":"tag-google-ai","14":"tag-killer-robots","15":"tag-sundar-pichai","16":"tag-us-department-of-defense"},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/20523","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/users\/2"}]
,"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/comments?post=20523"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/20523\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media\/20524"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media?parent=20523"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/categories?post=20523"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/tags?post=20523"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}