Programmed for war

 

Automated and robotic weapons are just one example of how technology is changing the way war is being waged. Are humanitarians and the rules of war keeping up?

IN MAY 2013, A BAT-WINGED unmanned craft the size of a standard fighter jet made its first flight from the deck of the aircraft carrier USS George H.W. Bush, just off the coast of the United States near Washington DC.

Known as the X-47B, the drone is much larger than the better-known Predator drones now in use, giving it a far longer range, and its ability to take off from a ship means it can be used almost anywhere in the world.

But there was something else that made the flight unique, even historic. The X-47B, according to the United States Navy, is designed so that it “can be programmed to carry out missions with no human intervention”. Unlike the drones currently in use, this weapon can be automated. In essence, it’s a robot with wings, guns and bombs.

“This is the way of the future,” United States Navy Rear Admiral Mat Winter was quoted as saying by the Associated Press.

And the X-47B is not the only such weapon on the drawing board. Numerous countries, with militaries large and small, are developing flying weapons systems that can be controlled remotely (like the drones currently in use) but that can also function autonomously.

High-speed conflict

From the military point of view, there are many advantages. Fighter drones could fly into defended airspace without putting pilots at risk and could manoeuvre more quickly, taking sharp turns that would injure or kill a human pilot. They can fly faster, longer and higher than traditional fighter jets, and those that are pre-programmed, or automated, would be able to continue a mission even if communication between the drone and the command centre is interrupted.

Meanwhile, a similar revolution is taking place on the ground. In the last 15 years, thousands of robots have been deployed in conflicts such as those in Iraq and Afghanistan. Most have been used to detonate improvised explosive devices, but in 2007 a robot modified to carry weapons was tested in Iraq.

Since then, China, Israel and Russia have also developed weaponized ground robot systems and other countries are following suit. They come in all shapes and sizes: some are only slightly larger than a remote-control toy, others are the size of large trucks. Usually fitted with tank-like treads or large wheels, many feature arms capable of simple tasks, manoeuvrable video cameras, infrared or night vision capabilities and weapons.

Their missions are manifold. They can enter buildings or territory occupied by enemy combatants, for reconnaissance or attack, and most of these systems are operated by remote control. In time, experts predict, ground robots could also be programmed for autonomous missions.

According to many experts, the advances being made today in artificial intelligence represent a quantum leap in warfare technology, similar to the advent of aviation in the first half of the 20th century. But this time, it’s not only countries with large militaries involved.

“Today we are definitely seeing a wide range of actors with access to new advanced technology, particularly as it becomes cheaper and simpler to use,” writes Peter W. Singer, director of the Center for 21st Century Security and Intelligence at the Brookings Institution in Washington DC, in a recent issue of the International Review of the Red Cross.

“When the point is reached where a micro-drone can be flown using an iPhone application — which is possible now — then suddenly a lot of people can use it,” says Singer, who is also the author of Wired for War: The Robotics Revolution and Conflict in the 21st Century.

Mechanized distinction

All this has serious implications for the way conflicts and the international balance of power may evolve. Some, such as UK-based computer scientist and robotics expert Noel Sharkey, worry that we are on the cusp of a new kind of arms race in which the weapons in question are relatively small, cheap and easy to produce, but extremely difficult to regulate. “Everybody will have this technology,” says Sharkey, noting that robotics technology is being driven as much by consumer and industrial markets as by military budgets.

This is why Sharkey is opposed to weapons systems that are not under human control at all times, and he feels new treaty law is the best way to ensure that. Several states have issued policy statements saying humans will always be in the loop when weapons capable of autonomous action are deployed. “But what does that mean?” asks Sharkey. “Does that mean someone pressing a button and after that the machine takes over?”

Fundamentally, Sharkey contends, it’s not a legal question, but one about our essential humanity. “We cannot delegate the decision to kill to a machine. It’s the ultimate indignity to have a machine decide to kill you.”

For humanitarians, robotic, automated or fully autonomous weapons systems also pose serious questions: as more of the targeting and firing functions of these machines are automated, will these highly efficient killing machines be able to make the necessary distinctions between combatants and military targets on the one hand and civilians on the other?

If, as some predict, automated, hypersonic warplanes dramatically speed up the pace of conflict, will humans be able to make sound decisions about targeting and protecting civilians given the lightning-fast pace of next-generation combat? Or will those decisions also become automated?

And if an autonomous or automated weapon does commit a violation of the rules of war, who will be held responsible? The commander who sent the drone or robot into battle, or the manufacturer of the software that runs the robot?

These questions are being hotly debated in academic, military and peace advocacy circles. While some are calling for regulation, new treaty law and even moratoriums and bans on such weapons, the ICRC is calling on states to live up to their obligations under the Geneva Conventions and their Additional Protocols to ensure that all new weapons systems comply with international humanitarian law (IHL) before they are developed and deployed.

There are already many legal, moral and political questions revolving around drones in use today, most notably by the United States to carry out strikes in Afghanistan, Pakistan and Yemen. But most of the questions concerning IHL and current drone missions have to do with the way those weapons are used, not the technology itself. The key element is that today, human beings are still in active control of the drones during their missions, albeit from a location far from the battlefield.

With autonomous weapons, the legal equation has shifted and debate is more closely centred on the technology and its capabilities. “Such a weapon would have to be able to distinguish not only between combatants and civilians, but also, for instance, between active combatants and those hors de combat, and between civilians taking a direct part in hostilities and armed civilians,” according to the ICRC.

An autonomous weapon would also have to comply with the rule of proportionality, which requires that the incidental civilian casualties expected from an attack on a military target not be excessive when weighed against the anticipated concrete and direct military advantage. And when attacking, it would have to be capable of taking precautions to minimize civilian casualties.

For Sharkey, the technology that would allow computers to make such distinctions and take such precautions is still far from reality. “If you had a perfectly clear environment, such as an open desert with a tank in it, you might be able to get it to distinguish the shape of the tank and attack it,” he says.

But even in modestly complex environments, such as a village centre or residential street, computers are simply not able to distinguish between multiple, changing shapes in a landscape cluttered with buildings, cars, trees and people, he says.

With weapons systems that are automated or semi-autonomous (i.e. programmed to carry out a series of specific, pre-programmed attacks), the questions are different. In this case, a human has made the targeting decisions. But what if the situation changes once the mission is launched, and a school bus suddenly pulls in front of the target? The system might allow for a human override, but if communication with the weapon is jammed by enemy forces (a common occurrence in warfare), there would be no turning back.

Some IHL experts, however, counter that such circumstances can arise with some non-autonomous weapons already in use. When a long-range cruise missile is fired, for example, the situation on the ground may change dramatically between the time the missile is launched and the time it strikes its target.

Loss of humanity?

Indeed, not all experts on robotics and IHL are convinced that automation or autonomy in weapons systems always contradict humanitarian values. As artificial intelligence improves, some contend that a robot could theoretically be programmed to behave — in a sense — more humanely than humans, particularly in high-stress, emotionally charged battlefield environments.

As this level of automation is still the stuff of science fiction, a more concrete and immediate example is found in defensive missile systems, already in use to identify, target and shoot down incoming missiles at speeds beyond the capability of human operators. Would it be fair, some experts ask, to prevent a state from using automation to defend people from a barrage of incoming rockets?

The chicken and the egg

In practical terms, however, it’s not likely that states will agree on treaty law regulating this emerging technology any time soon, says William Boothby, an expert on the process of state reviews of new weapons vis-à-vis international humanitarian law.

One reason is that militaries are generally not inclined to reveal their true technological capability in order to keep the upper hand in future conflict. “Part of a perceived advantage ebbs away if others become aware of the weapon and the way it works,” notes Boothby, author of the recently published book Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors.

“So it’s a chicken-and-egg scenario,” he adds. “Which states are going to legislate something the characteristics of which we don’t yet know? It’s difficult to evaluate the risks and opportunities associated with something that has not achieved a certain level of maturity.”

To Boothby, this is why it’s critical that states enhance their capacity to conduct legal reviews of each and every new weapons system, as already called for by treaty. “Of the 170 or so states that are treaty bound to conduct new weapons reviews, only about a dozen are known to have a regular process for systematically doing so,” he says. Boothby acknowledges that even when reviews are undertaken, the system is not perfect, notably because it is the states themselves that evaluate their own weapons systems. But he argues it’s an important and necessary step.

Whatever one’s position on robotic weapons, greater attention from the humanitarian sector is overdue, argues weapons expert Peter Singer, adding that when he first started talking with humanitarian organizations about new technology, “none of them [was] ready or willing to talk about technologies like the Predator” drone.

“The same phenomenon is playing out right now with the current development of technology,” he argues in his recent Review article. “The humanitarian community is ex post facto reacting to things that already exist and are being used. And thus its impact will be less because the community did not weigh in until it was already behind the curve.”

One reason may be that humanitarian organizations have remained extremely busy contending with current-day atrocities and violations, many committed with low-tech, conventional weapons — from machetes to automatic rifles.

On a deeper level, as Singer notes, all this poses questions that go beyond international humanitarian law: “The bottom-line question is, is it our machines that are wired for war, or is it us humans that are actually the ones wired for war?”

By Malcolm Lucard
Malcolm Lucard is editor of Red Cross Red Crescent magazine.


The experimental British-made Taranis stealth fighter drone taxies on a runway during tests in England in 2013. The Taranis will be programmed to be able to evade attack and select targets, but the manufacturer and the British government insist that it is designed to be flown by human operators and that targets will always be verified by a human operator before any attack is launched.
Photo: ©Ray Troll/BAE Systems

Weapons that function autonomously are not new. Landmines function without human intervention.
Photo: ©REUTERS/Nita Bhallia

More sophisticated machine-gun systems, used to guard border areas or sensitive facilities, can locate targets and fire without direct human control.
Photo: ©REUTERS/Pichi Chuang

Will these highly efficient killing machines be able to make the necessary distinctions between combatants and military targets on the one hand and civilians on the other?

Defensive missile systems are automated to make high-speed targeting decisions.
Photo: ©REUTERS/Darren Whiteside

Weapons being developed today take many forms. The US Navy’s X-47B, a pilotless stealth combat aircraft, is able to carry out pre-programmed missions.
Photo: ©REUTERS/Rich-Joseph Facun

Experts say insect-sized ‘nano-drones’ could be programmed to carry out missions and react to conditions that arise in the field.
Photo: ©REUTERS/Skip Peterson

An American soldier looks at an armed robotic vehicle known as MAARS, or Modular Advanced Armed Robotic System, at a military exposition at a United States Marine base in California, USA, in 2012.
Photo: ©REUTERS/Mike Blake

Stealth fighter drones have been under development for many years by numerous countries. Here a cameraman films a model of a proposed Chinese unmanned combat aerial vehicle, nicknamed ‘Anjian’ or ‘DarkSword’.
Photo: ©REUTERS/Bobby Yip

150 years of humanitarian action

As the first Geneva Convention turns 150 years old this year, RCRC magazine explores the future of international humanitarian law and the implications that new weapons and aid technology will have on humanitarian action and the rules of war.

 

 
