Canadian Reviews
Employees across OpenAI and Google support Anthropic’s lawsuit against the Pentagon
Digital World

9 March 2026 · 4 Mins Read

On Monday, Anthropic filed its lawsuit against the Department of Defense over being designated as a supply chain risk. Hours later, nearly 40 employees from OpenAI and Google — including Jeff Dean, Google’s chief scientist and Gemini lead — filed an amicus brief in support of Anthropic’s lawsuit, detailing their concerns over the Trump administration’s decision and the technology’s risks and implications.

The news follows a dramatic few weeks for Anthropic, in which the Trump administration labeled the company a supply chain risk, a designation typically reserved for foreign companies the government deems a threat to national security. The label came after Anthropic stood firm on two red lines regarding acceptable military use of its technology: domestic mass surveillance and fully autonomous weapons (AI systems with the power to kill without human involvement). Negotiations broke down amid public insults, and other AI companies stepped in to sign contracts allowing "any lawful use" of their technology.

The supply chain risk designation not only bars Anthropic from military contracts; it also blacklists other companies that use Anthropic products in their work for the Pentagon, forcing them to uproot Claude if they wish to keep their lucrative contracts. Because Claude was the first model cleared for classified intelligence work, however, Anthropic's tools are already deeply integrated into the Pentagon's operations. In fact, just hours after Defense Secretary Pete Hegseth announced the designation, the U.S. military reportedly used Claude in the campaign that killed the leader of Iran, Ayatollah Ali Khamenei.

The amicus brief argues that Anthropic's supply chain risk designation "is improper retaliation that harms the public interest" and that the concerns behind Anthropic's red lines "are real and require a response." It also urges that the substance of those two red lines be taken seriously, stating that "mass domestic surveillance powered by AI poses profound risks to democratic governance — even in responsible hands" and that "fully autonomous lethal weapons systems present risks that must also be addressed."

The group behind the amicus brief described themselves as “engineers, researchers, scientists, and other professionals employed at U.S. frontier artificial intelligence laboratories.”

“We build, train, and study the large-scale AI systems that serve a wide range of users and deployments, including in the consequential domains of national security, law enforcement, and military operations,” the group wrote. “We submit this brief not as spokespeople for any single company, but in our individual capacities as professionals with direct knowledge of what these systems can and cannot do, and what is at stake when their deployment outpaces the legal and ethical frameworks designed to govern them.”

On the domestic mass surveillance front, the group said that though data on American citizens exists everywhere in the form of surveillance cameras, geolocation data, social media posts, financial transactions, and more, “what does not yet exist is the AI layer that transforms this sprawling, fragmented data landscape into a unified, real-time surveillance apparatus.” Right now, they wrote, these data streams are siloed, but if AI were used to connect them, it could combine “face recognition data with location history, transaction records, social graphs, and behavioral patterns across hundreds of millions of people simultaneously.”

On lethal autonomous weapons specifically, the group said such systems can be unreliable in new or ambiguous conditions that differ from the environments they were trained in, meaning they "cannot be trusted to identify targets with perfect accuracy, and they are incapable of making the subtle contextual tradeoffs between achieving an objective and accounting for collateral effects that a human can." Additionally, the group wrote, the potential for these systems to hallucinate makes it important for humans to be involved in the decision-making process "before a lethal munition is launched at a human target," especially since a system's chain of reasoning is often unavailable to operators and unclear even to its developers.

The group behind the amicus brief wrote, “We are diverse in our politics and philosophies, but we are united in the conviction that today’s frontier AI systems present risks when deployed to enable domestic mass surveillance or the operation of autonomous lethal weapons systems without human oversight, and that those risks require some kind of guardrails, whether via technical safeguards or usage restrictions.”

© 2026 ThemeSphere. Designed by ThemeSphere.