DRONES: A WORD FOR YOU – NO JOKE – THEY PROVOKE COUNTER-ATTACKS SEEN AS LEGITIMATE DEFENSE
by John Sifton
What, in the final analysis, is troubling about the CIA’s use of drones? Drones are only one weapon system among many, and the CIA’s role, while disturbing, is not the primary cause for alarm. Certainly the legal identity of drone operators, CIA or military, matters little to the victims of a Hellfire strike. So what is it about the drone, really, that draws the attention of victims, insurgent propagandists, lawyers and journalists, more than other forms of kinetic violent force? Why do drones interest us, fascinate us or disturb us?
Perhaps one clue comes from linguistics. The weapons’ names suggest ruthless and inhumane characteristics. The first drone aircraft deployed by the CIA and Air Force after 2001 was the Predator, a rather coarse name even for a weapons system, suggesting that the enemy was not human but merely prey, and that military operations were not combat subject to the laws of war but a hunt.
(Some of the computer software used by the military and the CIA to calculate expected civilian casualties during airstrikes is known in government circles as Bug Splat.)
The Predator’s manufacturer, General Atomics, later developed the larger Reaper, a moniker implying that the United States was fate itself, cutting down enemies who were destined to die. That the drones’ payloads were called Hellfire missiles, invoking the punishment of the afterlife, added to a sense of righteousness.
But the real issue is the context of how drones kill. The curious characteristic of drones—and the names reinforce this—is that they are used primarily to target individual humans, not places or military forces as such. Yet they simultaneously obscure the human role in perpetrating the violence. Unlike a missile strike, in which a physical or geographic target is chosen beforehand, drones linger, looking precisely for a target—a human target. And yet, at the same time, the perpetrator of the violence is not physically present. Observers are drawn toward thinking that it is the Predator that kills Anwar al-Awlaki, or its Hellfire missiles, not the CIA officers who order the weapons’ engagement. On the one hand, we have the most intimate form of violence—the targeted killing of a specific person, which in some contexts is called assassination—while on the other hand, the least intimate of weapons.
This distance between the targets and the CIA officers at Langley who order the strikes is the defining characteristic of drones. They are the zenith of a technological quest that runs back to the invention of slings and arrows thousands of years ago, the efforts of the earliest perpetrators of violence to get away from their victims. That process, which brought catapults and later artillery, reached its first peak with the development of intercontinental nuclear missiles; but those are weapons of limited tactical use, and they have never been fired in anger. Drones allow all the alienation of long-range missions but with far more flexibility and capacity for everyday use. The net result is everyday violence with all the distance and alienation of ICBMs. This is disturbing, perhaps, because alienation is disturbing.
The work of animal behaviorists like Konrad Lorenz sheds some light on why. Lorenz—a onetime member of the Nazi party who later renounced his politics and won the Nobel Prize in the 1970s—spent much of his life studying violence in animals. His book On Aggression posited a theory whereby many animals, male and female, have a natural “drive” to be aggressive against opponents, including members of their own species.
The aggression drive, Lorenz posited, was often limited within species by a “submission” phenomenon, whereby potential victims turn off the aggressive drive in others by displaying signs of submission. In this way, most animal violence is checked before it occurs. Lorenz suggested that in humans, the submission safety valve was blunted by the technological creation of weapons, which emotionally “distanced” the killer from his victim. When a spear or sling is used to kill, victims lose the opportunity to engage in submission and trigger the aggression “off switch.” The drone represents an extreme extension of that process. Drones crossed into a new frontier in military affairs: an area of entirely risk-free, remote and even potentially automated killing detached from human behavioral cues.
Military research seems to back this up. Lt. Col. Dave Grossman, a psychologist and former professor at West Point, has written extensively on the natural human aversion to killing. His 1995 book On Killing contains a collection of accounts from his research and from military history demonstrating soldiers’ revulsion with killing—in particular, killing at close range. He tells the story of a Green Beret in Vietnam describing the killing of a young Vietnamese soldier: “I just opened up, fired the whole twenty rounds right at the kid, and he just laid there. I dropped my weapon and cried.” The most telling accounts are with the “close” kills of hand-to-hand combat. Grossman tells of a Special Forces sergeant from the Vietnam War describing a close kill: “‘When you get up close and personal,’ he drawled with a cud of chewing tobacco in his cheek, ‘where you can hear ‘em scream and see ‘em die,’ and here he spit tobacco for emphasis, ‘it’s a bitch.’”
Obviously the primary advantage of the drone is that it insulates its operators from risk. Yet one can’t help wondering whether aversion to the unpleasantness of violence is another factor making drones popular with the military and CIA. Drones make the nasty business of killing a little easier. Or do they?
There are reports of military drone operators suffering from post-traumatic stress disorder, and studies showing that those who conduct strikes or watch videos of strikes suffer from “operational stress,” which officials believe is the result of operators’ long hours and extended viewing of video feeds showing the results of military operations after they have occurred—i.e., dead bodies. Still, these reports pale in comparison with those of PTSD among combat veterans. And there is no public information about stress among those ordering the strikes—the CIA strike operators or the decision-makers at Langley.
A little-noticed 2011 British Defense Ministry study of unmanned drones discusses some of these points: from concerns about drone operators’ potential alienation from violence to the propaganda opportunities for enemies (noting that drones’ use “enables the insurgent to cast himself in the role of underdog and the West as a cowardly bully—that is unwilling to risk his own troops, but is happy to kill remotely”).
The paper also discusses concerns raised by military analyst Peter Singer, who has written on “robot warfare” and the risk that drones might acquire the capacity to engage enemies autonomously. The report envisions a scenario where a drone fires on a target “based solely on its own sensors, or shared information, and without recourse to higher, human authority.”
The authors note that in warfare, the risks of the battlefield and the horror that comes from carrying out violence can act as controls on brutality. Citing the oft-quoted adage of Gen. Robert E. Lee, reportedly uttered after the battle of Fredericksburg, “It is well that war is so terrible, otherwise we would grow too fond of it,” the authors then ask:
If we remove the risk of loss from the decision-makers’ calculations when considering crisis management options, do we make the use of armed force more attractive? Will decision-makers resort to war as a policy option far sooner than previously?
The issue is not that armed drones are more terrible or deadly than other weapons systems. On the contrary, the violence of drones today is more selective than many forms of military violence, and human rights groups recognize that drones, in comparison with less precise weapons, have the potential to minimize civilian casualties during legitimate military strikes.
Nor is the issue the remote delivery of weapons: alienation from the effects of violence reached a high-water mark in World War I. What makes drones disturbing is an unusual combination of characteristics: the distance between killer and killed, the asymmetry, the prospect of automation and, most of all, the minimization of pilot risk and political risk.
It is the merging of these characteristics that draws the attention of journalists, military analysts, human rights researchers and Al Qaeda propagandists, suggesting something disturbing about what human violence may become. The unique technology allows the mundane and regular violence of military force to be separated further from human emotion.
Drones foreshadow the idea that brutality could become detached from humanity—and yield violence that is, as it were, unconscious.
In this sense, drones foretell a future that is very dark indeed.
John Sifton is the Asia Advocacy Director and works on South and Southeast Asia. Previously, he was the director of One World Research. Sifton spent six years at Human Rights Watch, first as a researcher in the Asia division, focusing on Afghanistan and Pakistan, and then as the senior researcher on terrorism and counterterrorism.