No Lethal AI Weapons, 14 Groups Tell the Pentagon

WASHINGTON, D.C. – The U.S. military should clarify that it will not develop or deploy lethal weapons powered by artificial intelligence (AI), 14 groups said in a letter sent today to U.S. Defense Secretary Lloyd Austin and Deputy Defense Secretary Kathleen Hicks. The letter was co-signed by Public Citizen, the Future of Life Institute, Demand Progress, and Win Without War, among others.

The groups’ letter focuses on the Pentagon’s Replicator program, which proposes to rely heavily on drones to counter Chinese missile strength in a theoretical conflict over Taiwan or off China’s eastern coast. The just-passed appropriations bill includes $200 million in funding for Replicator, with an additional $300 million expected to be devoted to the program.

According to the groups, the Pentagon has not been sufficiently clear about whether the program involves the development and deployment of autonomous weapons. “This is no place for strategic ambiguity. Autonomous weapons are inherently dehumanizing and unethical, no matter whether a human is ‘ultimately’ responsible for the use of force or not,” the letter reads.

“The United States should state plainly that it will not create or deploy killer robots and should work to advance global treaty negotiations to ban such weapons,” said Robert Weissman, president of Public Citizen. “At minimum, the United States should commit that the Replicator Initiative will not involve the use of autonomous weapons. Ambiguity about the Replicator program essentially ensures a catastrophic arms race over autonomous weapons. That’s a race in which all of humanity is the loser.”