{"id":9048,"date":"2026-03-28T15:05:52","date_gmt":"2026-03-28T15:05:52","guid":{"rendered":"https:\/\/web3anchor.com\/index.php\/2026\/03\/28\/biden-judge-freezes-trump-administrations-move-against-ai-firm-fueling-battle-over-security-authority\/"},"modified":"2026-03-28T15:05:52","modified_gmt":"2026-03-28T15:05:52","slug":"biden-judge-freezes-trump-administrations-move-against-ai-firm-fueling-battle-over-security-authority","status":"publish","type":"post","link":"https:\/\/web3anchor.com\/index.php\/2026\/03\/28\/biden-judge-freezes-trump-administrations-move-against-ai-firm-fueling-battle-over-security-authority\/","title":{"rendered":"Biden judge freezes Trump administration\u2019s move against AI firm, fueling battle over security authority"},"content":{"rendered":"<div id=\"beyondwords-wrapper\" class=\"beyondwords-wrapper\"><\/div>\n<p class=\"speakable\">A <u>federal <\/u><u>judge<\/u>\u2019s decision to block the Trump administration from banning AI firm Anthropic from Department of War use is igniting a debate over whether the ruling pushes courts into national security decision-making.<\/p>\n<p class=\"speakable\">The ruling, issued late Thursday by U.S. District Judge Rita Lin, a Biden appointee to the Northern District of California, pauses the administration\u2019s broader effort to bar the company while the case proceeds, though it does not explicitly require the Pentagon to use Anthropic. 
The judge also gave the government one week to appeal.<\/p>\n<p>Under Secretary of War Emil Michael wrote on X that the ruling contained &#8220;dozens of factual errors&#8221; and was issued &#8220;during a time of conflict,&#8221; arguing it &#8220;seeks to upend the (president\u2019s) role as Commander in Chief&#8221; and disrupt the department\u2019s ability to conduct military operations.<\/p>\n<p>Michael said the administration views Anthropic as still designated a supply chain risk pending appeal, signaling officials are disputing the scope and effect of the court\u2019s injunction.<\/p>\n<p>Lin said the Pentagon\u2019s move to designate <u>Anthropic as a national security risk<\/u> was&nbsp;&#8220;likely both contrary to law and arbitrary and capricious.&#8221;<\/p>\n<p>&#8220;Nothing in the governing statute supports the Orwellian notion that an American company may be branded a potential adversary and saboteur of the U.S. for expressing disagreement with the government,&#8221; Lin said.<\/p>\n<p>&#8220;Can a judge order the Department of War to use a vendor that is a security risk? No, but also yes? Judge Lin (Biden N.D. California) tries to stop President Trump\/Secretary Hegseth from banning Anthropic. 
But acknowledges they can choose not to use it?&#8221; one X user, Eric Wess, wrote.&nbsp;<\/p>\n<p>Others described the ruling as &#8220;pure judicial activism&#8221; and accused the judge of interfering in a national security decision.<\/p>\n<p>But supporters of the decision \u2014 including a bipartisan group of nearly 150 retired federal and state judges \u2014 say the administration overstepped, warning that the Pentagon\u2019s &#8220;supply chain risk&#8221; designation appeared improperly applied and could chill free speech and legitimate business activity.<\/p>\n<p>In a March 3 letter, the Pentagon had notified Anthropic it would be designated a supply chain risk to national security. That designation ordered that no contractor, supplier or partner doing business with the United States military may conduct commercial activity with Anthropic.<\/p>\n<p>The legal fight follows a broader dispute between the <u>Pentagon and Anthropic<\/u> over how the company\u2019s AI system, Claude, can be used in military operations. 
Claude is the only commercial AI system approved for classified use.&nbsp;<\/p>\n<p>War Secretary Pete Hegseth had warned Anthropic it would face termination of its $200 million contract, awarded in July 2025, or be designated a supply chain risk if it did not allow its AI platform to be approved for all lawful uses.&nbsp;<\/p>\n<p>Anthropic insisted it would not allow Claude to be used for fully autonomous weapons or mass surveillance of Americans.&nbsp;<\/p>\n<p>Pentagon officials say such uses already are not permitted, emphasizing that humans remain in the loop for lethal decisions and that the military does not conduct domestic surveillance, but maintain that private companies cannot dictate how their systems are used in lawful operations.<\/p>\n<p>Lin pointed to the breadth of the measures \u2014 including a government-wide ban and contractor restrictions \u2014 saying they did not appear &#8220;tailored to the stated national security concern&#8221; and instead &#8220;look(ed) like an attempt to cripple Anthropic.&#8221;<\/p>\n<p>Anthropic welcomed the decision, saying in a statement: &#8220;We\u2019re grateful to the court for moving swiftly, and pleased they agree Anthropic is likely to succeed on the merits.&#8221;<\/p>\n<p>Hegseth accused CEO Dario Amodei and Anthropic of a &#8220;master class in arrogance&#8221; and a &#8220;textbook case of how not to do business with the United States Government&#8221; in a Feb. 
27 post on X.&nbsp;<\/p>\n<p>OpenAI has emerged as a key alternative, securing a <u>Pentagon deal<\/u> to deploy its models on classified systems as tensions with Anthropic escalated.&nbsp;<\/p>\n<p>Still, Anthropic has not been fully displaced \u2014 its Claude system remains deeply embedded in military workflows, and replacing it would take time.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A federal judge\u2019s decision to block the Trump administration from banning AI firm Anthropic from Department of War use is igniting a debate over whether the ruling pushes courts into national security decision-making. The ruling, issued late Thursday by U.S. District Judge Rita Lin, a Biden appointee to the Northern District of California, pauses the [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":9049,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[],"class_list":["post-9048","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-world-news"],"_links":{"self":[{"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/posts\/9048","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/comments?post=9048"}],"version-history":[{"count":0,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/posts\/9048\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/media\/9049"}],"wp:attachment":[{"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/media?parent=9048"}],"w
p:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/categories?post=9048"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/web3anchor.com\/index.php\/wp-json\/wp\/v2\/tags?post=9048"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}