
Foreign Policy: If Drones Had Feelings, They'd Be Hurt

Drones aren't unlawful killing machines, but misunderstood and useful military tools, some say. (iStockphoto.com)

Charli Carpenter is an associate professor of international relations at the University of Massachusetts, Amherst, and blogs about human security. Lina Shaikhouni is completing a degree in political science at the University of Massachusetts, Amherst, with an emphasis on human rights and humanitarian law.

Killer robots. Video-game warfare. Unlawful weapons. Terminators. Drone-attack commentary has become synonymous with reports of civilian carnage, claims of international-law violations, and worries about whether high-tech robotic wars have become too easy and fun to be effectively prevented. But the debate over drones is misleading the public about the nature of the weaponry and the law. It is also distracting attention from bigger and more important issues: whether truly autonomous weapons should be permitted in combat, how to track the human cost of different weapons platforms and promote humanitarian standards in war, and whether targeted killings — by drones or SEAL teams — are lawful means to combat global terrorism. Based on our analysis of recent op-eds, we unpack four sets of misconceptions below and offer some sensible ways for the anti-drone lobby to reframe the debate.

Misconception No. 1: Drones Are "Killer Robots."

This is actually two assumptions; neither is precisely wrong, but both are misleading. First, drones themselves are not necessarily "killers": They are used for many nonlethal purposes as well. Drones (unmanned aerial vehicles) can carry anything ranging from cameras to sensors to weapons and have been deployed for nonlethal purposes such as intelligence gathering and surveillance since the 1950s. Yet the nonlethal applications of drones are often lost in a discussion that treats the technology per se as deadly; 90 percent of the op-eds we analyzed focus solely on drones as killing machines.

Of course, it's true that drones can be used to kill. Some drones over Libya are now armed, and armed drones have been launching strikes in Afghanistan, Pakistan, and Yemen for years. Second, even weaponized drones are not "killer robots," despite the frequent reference in the op-eds we studied to "robotic weapons" or "robotic warfare." Their flight and surveillance systems are able to extract information from their environment and use it to move safely in a purposive manner, but the weapons themselves are controlled by a human operator and are not autonomous. With a human-in-the-loop navigating the aircraft and controlling the weapon, the "killer" aspect of these specific drones may be remote-controlled, but it's not robotic.

This important distinction is easily lost on a concerned public, but it matters. Indeed, the debate over "killer robot drones" that actually aren't autonomous is preventing public attention from being directed to a more ground-breaking development in military technology: preparations to delegate targeting decisions to truly autonomous weapons platforms, many of which are not drones at all. As one Brookings Institution scholar has argued, a shift toward fully autonomous weapons systems would represent a sea change in the very nature of war. Some groups have called for a multilateral discussion to stem, or at least regulate, these developments. Those worried about drones might usefully refocus their attention on the debate over whether to keep humans in the loop for unmanned aerial vehicles and other weapons platforms globally. The big issue here is not drones per se. It is the extent to which life-and-death targeting decisions should ever be outsourced to machines.

Misconception No. 2: Drones Make War Easy and Game-Like, and Therefore Likelier.

Remote-controlled violence even with a human in the loop also has people concerned: Nearly 40 percent of the op-eds we studied say that remote-control killing makes war too much like a video game. Many argue this increases the likelihood of armed conflict.

It's a variation on an old argument: Other revolutions in military technology — the longbow, gunpowder, the airplane — have also progressively removed the weapons-bearer from hand-to-hand combat with his foe. Many of these advances, too, were initially criticized for degrading the professional art of war or taking it away from military elites. For example, European aristocrats originally considered the longbow and firearms unchivalrous for a combination of these reasons.

It's true that all killing requires emotional distancing, and militaries throughout time have worked hard to devise ways to ease the psychological impact on soldiers of killing for the state in the national interest. Yet it's not so clear whether the so-called Nintendo effect of drones increases social distance or makes killing easier. Some anecdotal evidence suggests the opposite: Drone pilots say they suffer mental stress precisely because they have detailed, real-time images of their targets, and because they go home to their families afterward rather than debriefing with their units in the field. Studies haven't yet confirmed which view is accurate or whether it's somehow both.

Even if some variant of the Nintendo effect turns out to be real, there is little evidence that distancing soldiers from the battlefield or the act of killing makes war itself more likely rather than less. If that were true, the world would be awash in conflict. As former Lt. Col. Dave Grossman has documented, at no time in history has the combination of technology and military training strategies made killing so easy — a trend that began after World War I. Yet as political scientist Joshua Goldstein demonstrates in a forthcoming book, the incidence of international war — wars between two or more states — has been declining for 70 years.

The political debate over drones should move away from the fear that military advancements make war inevitable and instead focus on whether certain weapons and platforms are more or less useful for preventing conflict at a greater or lesser cost to innocent civilian lives. Activists should keep pressure on elected officials, military leaders, and other public institutions to make armed conflict, where it occurs, as bloodless as possible. For example, some human rights groups say the Nintendo effect itself could be harnessed to serve humanitarian outcomes — by embedding the laws of war into game designs.

So the wider issue here, too, is not drones. It is about ensuring that a humanitarian code of conduct in war is protected and strengthened.

Misconception No. 3: Drone Strikes Kill Too Many Civilians.

It's hard to argue with this value judgment — in some ways, even one dead civilian is indeed "too many." But it's hard to single out drones when we know so little about whether they kill more or fewer civilians than manned aerial bombing or ground troops would in the same engagements — which also, in some cases, save lives. So a better question than "how many" is: relative to what, who's counting, and how?

Civilians do die in drone attacks, as they do in other types of combat. But accurate reports on drone-strike casualties — and casualties from other types of attack — are very hard to find because no official body is tasked with keeping track. This should change: All collateral damage, not just that caused by drones, needs to be tracked and minimized by the governments that inflict it.

To demonstrate this wider problem, consider efforts to tally drone deaths. These statistics vary wildly among different sources depending on how sources define who is a militant and who is a civilian. Pakistan Body Count, which keeps a dataset based on news reports, defines all drone deaths as civilians unless the report clearly specifies which terrorist organization the dead belonged to. According to its founder, Pakistani computer scientist Zeeshan-ul-hassan Usmani, the resulting numbers suggest civilians account for 88 percent of all drone-strike deaths in Pakistan since 2004.

But the New America Foundation's similar dataset, compiled by analysts Peter Bergen and Katherine Tiedemann, shows drastically different results. They, too, rely on news reports, but they estimate the civilian fatality rate to be only 32 percent, with the remaining 68 percent militants. Moreover, they claim this percentage is shrinking over time. Unlike Pakistan Body Count, Bergen and Tiedemann code any individuals whose status is unknown as "militants" rather than civilians. A report from the Jamestown Foundation comes up with an even lower number by excluding all men and teenage boys from the "civilian" category — a problematic maneuver from a war law perspective.

An even bigger problem with all these estimates, however, is that they do not measure actual deaths but rather "reported deaths." And journalists are poorly placed to distinguish civilians from combatants. Reports often conflate militants with suspected militants and pool them in the same category, an assumption that discounts civilian casualties. Numbers in media reports are also sometimes vague, leaving it to the discretion of the number crunchers how to interpret them. (Pakistan Body Count translates the term "many civilians" into eight deaths and "several civilians" into four.) Moreover, all these databases rely on news reports and public statements by the governments that are doing the bombing.
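To see why these coding choices matter so much, here is a minimal, hypothetical sketch — the figures are invented for illustration and are not drawn from any of the datasets above — of how the same set of reported deaths yields very different civilian percentages depending on how "unknowns" are classified.

```python
# Hypothetical illustration only: how the rule for coding "unknown" deaths
# changes the civilian share computed from the same news reports.
reports = [
    {"deaths": 5, "status": "militant"},
    {"deaths": 3, "status": "civilian"},
    {"deaths": 4, "status": "unknown"},  # report gave no affiliation
]

def civilian_share(reports, unknown_counts_as):
    """Fraction of reported deaths coded as civilian under a given rule."""
    civilian = total = 0
    for r in reports:
        total += r["deaths"]
        status = unknown_counts_as if r["status"] == "unknown" else r["status"]
        if status == "civilian":
            civilian += r["deaths"]
    return civilian / total

# Rule in the style of Pakistan Body Count: unknowns default to civilian.
print(civilian_share(reports, unknown_counts_as="civilian"))  # 7/12 ≈ 0.58
# Rule in the style of the New America Foundation: unknowns default to militant.
print(civilian_share(reports, unknown_counts_as="militant"))  # 3/12 = 0.25
```

With identical underlying reports, one convention produces a civilian share more than twice the other — which is why the headline percentages from competing trackers diverge so sharply.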

Ultimately, the problem here is bigger than drones. It is the absence of a global regime for systematically estimating how many civilians suffer deaths and injuries due to incidental harm from military operations in general. Knowing whether drones constitute the best tool for conducting certain operations requires more than counting drone-strike casualties. The question is not how much collateral damage drones cause, but whether that damage is greater or less than that from aerial attacks by manned aircraft or from ground troops.

Such data is necessary to make the case that drones either are or aren't a suitably discriminate weapon. The world needs a standardized reporting system for tracking civilian and combatant deaths globally in order to really understand the effects of different weapons technologies on civilians. Only then can we have an informed debate about how to minimize war's impact on civilians — while enabling governments to use force when necessary and legitimate. Until then, we're really just guessing.

Misconception No. 4: Drones Violate the International Law of Armed Conflict.

No, they don't — at least, no more so than any other weapons platform when it is used improperly or in the wrong context.

The Hague and Geneva conventions actually place very few restrictions on specific weapons. Nothing in the laws of war, for example, requires that weapons make killing difficult or that they level the playing field. Value judgments aside, the treaties allow for a significant amount of injury and harm both to combatants and civilians. They ask only that harm to combatants be as humane as possible and that harm to noncombatants be minimized.

Weapons have been banned outright when by design they fail on one of these criteria. Chemical weapons and certain types of land mines and cluster munitions are considered to be inherently indiscriminate because they can't be controlled once deployed. Blinding lasers were banned not because they are indiscriminate (quite the opposite) but because international society judged that permanently blinding a soldier or airman constituted superfluous suffering beyond that required by military necessity.

Weaponized drones are not themselves weapons, but rather are platforms for launching air-to-ground kinetic weapons that kill through blasts and explosions. They differ from other types of bombing platforms only in that they are remotely controlled. Although some have argued that explosive weapons used in civilian population areas do not meet the proportionality test — meaning the benefits of their use don't outweigh the humanitarian damage they cause — bombing is currently an accepted practice in international society. It is hard to argue that remotely controlled drone-fired missiles are any more unnecessarily injurious than bombs launched from the air by human pilots.

Military operations inside Pakistan do pose international legal problems, but it's not because of the drones. It's because the United States is technically not at war with Pakistan and because U.S. drone operations in Pakistan are being conducted by the CIA rather than the armed forces. The former violates the U.N. Charter; the latter arguably violates the rules on lawful combat in the Geneva Conventions. These dynamics create legal problems for U.S. military operations in Pakistan whether they are carried out by drones or by SEAL teams on the ground, as in the Abbottabad raid that killed Osama bin Laden. A drone, in short, can be one means by which international law is violated, but it itself is not the source of the violation.

The legal debate over drones needs to refocus on what drones are being used for, not on the nature or effects of the weapons themselves. The real issue is not drones, but the summary execution of suspected criminals without evidence or trial, in complete secrecy, at perhaps an unacceptable cost to innocent lives. Whether this is happening with or without the consent of the Pakistani or Yemeni government is irrelevant. Whether it is being conducted by the CIA or by the U.S. military is irrelevant. Whether it is occurring with remotely piloted drones, manned aircraft, special operations forces, or death squads is irrelevant. What matters is whether extrajudicial execution is or is not the best way to protect citizens against terrorist attacks.

Those who oppose the way drones are used should shift their focus to the bigger normative problems touched on by the drone issue: the military robotics revolution, collateral-damage control, and the return of extrajudicial execution. Focusing on the drones themselves misses this bigger picture.

Copyright 2020 Foreign Policy.

Charli Carpenter and Lina Shaikhouni