{"id":28485,"date":"2026-05-05T20:08:16","date_gmt":"2026-05-05T20:08:16","guid":{"rendered":"https:\/\/www.europesays.com\/ai\/28485\/"},"modified":"2026-05-05T20:08:16","modified_gmt":"2026-05-05T20:08:16","slug":"trump-admin-to-review-ai-models-from-google-microsoft-xai-ahead-of-public-release","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ai\/28485\/","title":{"rendered":"Trump admin to review AI models from Google, Microsoft, xAI ahead of public release"},"content":{"rendered":"<p class=\"mb-4 text-lg md:leading-8 break-words\">The Trump administration on Tuesday announced that it had reached new agreements with Microsoft, Google DeepMind and Elon Musk&#8217;s xAI to expand collaboration with Big Tech companies in researching <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/artificial-intelligence\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:artificial intelligence (AI);itc:0;sec:content-canvas\" class=\"link \">artificial intelligence (AI)<\/a> and security.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">The Center for AI Standards and Innovation (CAISI), which is part of the Commerce Department&#8217;s National Institute of Standards and Technology, will work with the AI companies on pre-deployment evaluations as well as targeted research into frontier AI capabilities and <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/tech\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:AI security;itc:0;sec:content-canvas\" class=\"link \">AI security<\/a>.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">The new agreements build on previously announced partnerships between CAISI and the companies, supporting information-sharing, driving voluntary product improvements and ensuring a clear understanding in government of AI capabilities and the state of international AI 
competition.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">&#8220;Independent, rigorous measurement science is essential to understanding frontier AI and its national security implications,&#8221; said CAISI Director Chris Fall. &#8220;These expanded industry collaborations help us scale our work in the public interest at a critical moment.&#8221;<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Developers frequently provide CAISI with models that have reduced or removed safeguards to evaluate national security-related capabilities and risks.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Evaluators from across <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/government-and-institutions\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:government agencies;itc:0;sec:content-canvas\" class=\"link \">government agencies<\/a> may participate in evaluations and regularly provide feedback through the TRAINS Taskforce, which is a group of interagency experts focused on AI national security concerns.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">CAISI&#8217;s agreements support testing in classified 
environments and were drafted with flexibility to respond to continued advancements in AI.<\/p>\n<p><img alt=\"Microsoft Logo\" loading=\"lazy\" width=\"960\" height=\"540\" decoding=\"async\" data-nimg=\"1\" class=\"rounded-lg\" style=\"color:transparent\" src=\"https:\/\/www.europesays.com\/ai\/wp-content\/uploads\/2026\/05\/a24a8c2e5922d3789040b3d41ddaa295.jpeg\"\/><\/p>\n<p>Microsoft said the CAISI partnership is needed to build trust and confidence in advanced AI systems.<\/p>\n<p>(Getty Images)<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\"><a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/microsoft\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:Microsoft;itc:0;sec:content-canvas\" class=\"link \">Microsoft<\/a> chief responsible AI officer Natasha Crampton said in a release that the agreements will &#8220;advance the science of AI testing and evaluation, including through collaborative work to test Microsoft&#8217;s frontier models, assess safeguards, and help mitigate national security and large-scale public safety risks.&#8221;<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Crampton said that &#8220;ongoing, rigorous testing is essential to building trust and confidence in advanced AI systems.&#8221;<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">&#8220;Well-constructed tests help us understand whether our systems are working as intended and delivering the benefits they are designed to provide. Testing also helps us stay ahead of risks, such as <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/cyber-security\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:AI-driven cyberattacks;itc:0;sec:content-canvas\" class=\"link \">AI-driven cyberattacks<\/a> and other criminal misuses of AI systems, that can emerge once advanced AI systems are deployed in the world,&#8221; Crampton explained.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Microsoft also announced a similar agreement with the <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/category\/fox-news-united-kingdom\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:United Kingdom&#039;s;itc:0;sec:content-canvas\" class=\"link \">United Kingdom&#8217;s<\/a> AI Security Institute (AISI) to govern AI testing and evaluation.<\/p>\n<p class=\"mb-4 text-lg md:leading-8 break-words\">Original article source: <a data-yga=\"{\" ylinkelement=\"\" href=\"https:\/\/www.foxbusiness.com\/technology\/trump-admin-review-ai-models-from-google-microsoft-xai-ahead-public-release\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"elm:link;elmt:article_link;slk:Trump admin to review AI models from Google, Microsoft, xAI ahead of public release;itc:0;sec:content-canvas\" class=\"link \">Trump admin to review AI 
models from Google, Microsoft, xAI ahead of public release<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"The Trump administration on Tuesday announced that it had reached new agreements with Microsoft, Google DeepMind and Elon&hellip;\n","protected":false},"author":2,"featured_media":28486,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[18501,18744,5044,140,132,7543,320,18745],"class_list":{"0":"post-28485","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-google","8":"tag-caisi","9":"tag-center-for-ai-standards","10":"tag-deepmind","11":"tag-elon-musk","12":"tag-google","13":"tag-google-deepmind","14":"tag-microsoft","15":"tag-national-institute-of-standards-and-technology"},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/28485","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/comments?post=28485"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/posts\/28485\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media\/28486"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/media?parent=28485"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/categories?post=28485"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ai\/wp-json\/wp\/v2\/tags?post=28485"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":t
rue}]}}