KYIV, Ukraine (AP) — Advances in drones in Ukraine have accelerated a long-awaited technological trend that could soon bring the world’s first fully autonomous combat robots to the battlefield, ushering in a new era of warfare.
The longer the war lasts, the more likely drones will be used to identify, select and attack targets without the help of humans, according to military analysts, fighters and artificial intelligence researchers.
That would mark a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and AI-powered anti-drone weapons. Russia also claims to possess AI weaponry, though those claims are unproven. But there are no confirmed instances of a nation fielding robots that have killed entirely on their own.
Experts say it may only be a matter of time before Russia or Ukraine, or both, deploy them.
“A lot of states are developing this technology,” said Zachary Kallenborn, a weapons innovation analyst at George Mason University. “Clearly, it’s not that difficult.”
The sense of inevitability extends to activists, who have tried for years to ban killer drones but now believe they must settle for trying to restrict the weapons' offensive use.
Ukraine’s minister of digital transformation, Mykhailo Fedorov, agrees that fully autonomous killer drones are “a logical and inevitable next step” in weapons development. He said Ukraine has been doing “a lot of research and development in this direction.”
“I think the potential for this is great in the next six months,” Fedorov told The Associated Press in a recent interview.
Ukrainian Lt. Col. Yaroslav Honchar, co-founder of the non-profit combat drone innovation organization Aerorozvidka, said in a recent interview near the front lines that human war fighters simply cannot process information and make decisions as quickly as the machines.
Ukrainian military leaders currently prohibit the use of fully independent lethal weapons, though that could change, he said.
“We haven’t crossed this line yet, and I say ‘yet’ because I don’t know what will happen in the future,” said Honchar, whose group has spearheaded drone innovation in Ukraine, turning cheap commercial drones into lethal weapons.
Russia could obtain autonomous AI from Iran or elsewhere. The long-range Shahed-136 explosive drones supplied by Iran have crippled Ukrainian power plants and terrorized civilians, but they are not especially smart. Iran has other drones in its evolving arsenal that it says are AI-powered.
Without much trouble, Ukraine could make its semi-autonomous armed drones fully independent so they could better survive battlefield jamming, their Western manufacturers say.
Those drones include the US-made Switchblade 600 and the Polish Warmate, both of which currently require a human to choose targets over a live video feed; AI then finishes the job. The drones, technically known as “loitering munitions,” can hover over a target for minutes, waiting for a clean shot.
“The technology to achieve a fully autonomous mission with the Switchblade practically exists today,” said Wahid Nawabi, CEO of AeroVironment, its manufacturer. That will require a policy change, removing the human from the decision-making loop, which he estimates is three years away.
Drones can already recognize targets such as armored vehicles using cataloged images. But there is disagreement over whether the technology is reliable enough to ensure the machines don't err and take the lives of noncombatants.
AP asked the defense ministries of Ukraine and Russia if they have used autonomous weapons offensively, and if they would agree not to use them if the other side similarly agreed. None responded.
If either side were to go on the attack with full AI, it might not even be a first.
An inconclusive UN report last year suggested that killer robots made their debut in Libya’s internal conflict in 2020, when Turkish-made Kargu-2 drones in fully automatic mode killed an unspecified number of fighters.
A spokesman for STM, the manufacturer, said the report was based on “speculative, unverified” information and “should not be taken seriously.” He told the AP that the Kargu-2 cannot attack a target until the operator tells it to.
Fully autonomous AI is already helping to defend Ukraine. Utah-based Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars and UAVs, both powered by AI. The radars are designed to identify enemy drones, which the UAVs then disable by firing nets at them, all without human assistance.
The number of AI-powered drones keeps growing. Israel has been exporting them for decades. Its radar-killing Harpy can hover over anti-aircraft radar for up to nine hours, waiting for it to switch on.
Other examples include Beijing’s Blowfish-3 unmanned armed helicopter. Russia has been working on a nuclear-tipped underwater artificial intelligence drone called the Poseidon. The Dutch are currently testing a ground robot with a .50 caliber machine gun.
Honchar believes that Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have already used killer autonomous drones if the Kremlin had them.
“I don’t think they have any qualms,” agreed Adam Bartosiewicz, vice president of the WB Group, which makes the Warmate.
AI is a priority for Russia. President Vladimir Putin said in 2017 that whoever masters the technology will rule the world. In a Dec. 21 speech, he expressed confidence in the Russian arms industry’s ability to embed AI into war machines, stressing that “the most effective weapons systems are those that operate quickly and practically in an automatic mode.” Russian officials already claim their Lancet drone can operate with full autonomy.
“It’s not going to be easy to know if and when Russia crosses that line,” said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Center for Artificial Intelligence.
Switching a remotely piloted drone to full autonomy might not even be noticeable. To date, drones capable of operating in both modes have performed better when piloted by a human, Allen said.
The technology isn’t especially complicated, said University of California-Berkeley professor Stuart Russell, a leading AI researcher. In the mid-2010s, colleagues he surveyed agreed that graduate students could, in a single term, produce an autonomous drone “capable of finding and killing an individual, say, inside a building,” he said.
An effort to establish international ground rules for military drones has so far been unsuccessful. Nine years of informal United Nations talks in Geneva made little progress, with major powers including the United States and Russia opposing a ban. The last session, in December, ended without a new round scheduled.
Washington lawmakers say they will not agree to a ban because rivals developing drones cannot be trusted to use them ethically.
Toby Walsh, an Australian academic who, like Russell, campaigns against killer robots, hopes to reach a consensus on some limits, including a ban on systems that use facial recognition and other data to identify or attack individuals or categories of individuals.
“If we’re not careful, they’re going to proliferate much more easily than nuclear weapons,” said Walsh, author of “Machines Behaving Badly.” “If you can make a robot kill one person, you can make it kill a thousand.”
Scientists also worry that AI weapons could be repurposed by terrorists. In one feared scenario, the US military spends hundreds of millions of dollars writing code to power killer drones, only for it to be stolen and copied, effectively handing terrorists the same weapon.
The world public is concerned. A 2019 Ipsos poll conducted for Human Rights Watch found that 61% of adults in 26 countries oppose the use of lethal autonomous weapon systems.
To date, the Pentagon has not clearly defined an “autonomous weapon” or authorized a single such weapon for use by US troops, said Allen, a former Defense Department official. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two undersecretaries.
That doesn’t stop weapons from being developed in the US. There are projects underway at the Defense Advanced Research Projects Agency, military laboratories, academic institutions, and in the private sector.
The Pentagon has emphasized using AI to augment human warriors. The Air Force is exploring ways to pair pilots with drone wingmen. One proponent of the idea, former Deputy Secretary of Defense Robert O. Work, said in a report last month that “it would be crazy not to go autonomous” once AI-enabled systems surpass humans, a threshold he said was crossed in 2015, when computer vision eclipsed that of humans.
Humans have already been sidelined in some defensive systems. Israel’s Iron Dome anti-missile shield is authorized to open fire automatically, though it is said to be monitored by a person who can intervene if the system goes after the wrong target.
Multiple countries and all branches of the US military are developing drones that can attack in synchronized deadly swarms, according to Kallenborn, the George Mason researcher.
So will future wars become a fight to the last drone?
That’s what Putin predicted in a 2017 televised chat with engineering students: “When one party’s drones are destroyed by another party’s drones, you will have no choice but to surrender.”
Frank Bajak reported from Boston. Associated Press writers Tara Copp in Washington, Garance Burke in San Francisco and Suzan Fraser in Turkey contributed to this report.
Follow AP’s coverage of the war at https://apnews.com/hub/russia-ukraine