
Senate Democrats Draft AI Limits on Autonomous Weapons and Domestic Surveillance

Axios reports that Senate Democrats, led by Sen. Adam Schiff (D‑Calif.), are drafting legislation to codify federal guardrails on the use of artificial intelligence in fully autonomous weapons and domestic mass surveillance, with an eye toward attaching it to this year's must‑pass National Defense Authorization Act. The move follows a sharp standoff between the Trump administration and AI company Anthropic, which was designated a 'supply chain risk' after refusing to give the Pentagon unrestricted access to its models for uses including mass surveillance of Americans and weapons that fire without human involvement.

Defense Secretary Pete Hegseth has publicly pushed to rapidly embed AI across military operations and is demanding that AI vendors provide unfettered access to their systems, drawing bipartisan criticism. Retiring Republican Sen. Thom Tillis called the administration's handling of Anthropic 'sophomoric,' and Sen. Mark Kelly (D‑Ariz.) signaled broader support for limits, saying at Brookings that it is reasonable to expect clear boundaries on what AI should and should not be used for in the U.S. military.

The fight is emerging as one of the first concrete tests of how Congress will respond to battlefield AI adoption and domestic spying concerns in the Trump administration's Iran war era. Tech‑policy circles and civil‑liberties advocates on social media warn that whatever language lands in the NDAA could quietly set the rules of engagement for algorithmic warfare and surveillance at home.

AI and National Security Policy · Congress and Trump Administration Clashes

📌 Key Facts

  • Sen. Adam Schiff (D‑Calif.) is drafting legislation to establish 'commonsense safeguards' on federal use of AI in fully autonomous weapons and domestic mass surveillance.
  • Senate Democrats are targeting the 2026 National Defense Authorization Act as a vehicle for the AI guardrails.
  • The Trump administration recently designated Anthropic a supply chain risk after the company refused to give the Pentagon unfettered access to its AI models for mass surveillance and weapons that fire without human involvement.
  • Defense Secretary Pete Hegseth is pressing for rapid, across‑the‑board military integration of AI and demands unrestricted access from AI contractors.
  • Retiring GOP Sen. Thom Tillis criticized the administration’s treatment of Anthropic as ‘sophomoric,’ signaling bipartisan unease.

📊 Relevant Data

Facial recognition AI systems have higher error rates for Black and dark-skinned individuals compared to White individuals, with studies showing error rates up to 34% higher for darker skin tones.

Source: 'Facial recognition AI is flawed and racially biased. Police keep using it anyway' — Fortune

Autonomous weapons systems pose risks to compliance with international humanitarian law, including challenges in ensuring proportionality and distinction between combatants and civilians.

Source: 'Why we should limit the autonomy of AI-enabled weapons' — Nature

Mass surveillance disproportionately impacts communities of color and immigrants, contributing to higher rates of arrests and incarceration for Black and Indigenous populations compared to their percentage in the U.S. population.

Source: 'We study mass surveillance for social control, and we see Trump laying the groundwork to contain people of color and immigrants' — USC Dornsife

A majority of Americans (54%) believe AI-powered mass surveillance is too dangerous and violates privacy and civil liberties, with higher opposition among Democrats (63%) than Republicans (45%).

Source: 'Americans Want Humans in Control of AI' — Information Technology and Innovation Foundation (ITIF)

The development of lethal autonomous weapons systems (LAWS) could undermine core principles in armed conflict such as responsibility, proportionality, and distinction, potentially leading to an AI arms race.

Source: 'Governing Lethal Autonomous Weapons in a New Era of Military AI' — Trends Research & Advisory
