This article explores the role robots will play in future military operations and the challenges that role poses. Military personnel believe that robots may not replace soldiers; however, as automated systems become more capable, they will certainly supplement human troops. On the ground, soldiers deploy robots to search dangerous locations and to find and defuse roadside bombs; new models that carry weapons have already entered service. Robotic warfare is open-source warfare: enemies can not only probe robots for technological vulnerabilities, but they can also build robots themselves. Some commanders worry that distancing soldiers from the results of their actions makes it easier to commit war crimes. Robot-based warfare might well change the types of skills the military looks for in future soldiers: while it takes years to train an F-15 pilot, drone pilots achieve comparable results with just months of training, and they fly far cheaper aircraft. The development of robot-based warfare has been a great asset to the American military. However, there is a profound risk inherent in this technology should some other force discover a better way to employ robot warriors.
General James Mattis was walking the perimeter of a Marine camp in Iraq after a difficult day when he noticed six Marines drawn up in formation with a small American flag. He walked over. They were an explosive ordnance disposal team, they told him, and they were burying the remains of a robot used to disarm bombs.
“They were giving it a full military honors funeral,” Mattis recounted months later. “They said it took six wounds. ‘We were able to put it back together six times, but it was blown up, and this time, as you can see, there's nothing left.’”
Was this some sort of stunt or prank on the part of young men under enormous pressure?
“It was probably less humorous than it may sound here today,” Mattis told his audience. “That robot had saved their lives. It had crawled up next to bombs how many times and they had actually developed a fondness that oftentimes you develop for your shipmates when you’re in tough times,” Mattis said.
Robots are going to play a larger and larger role in future military operations, Mattis added. He should know: As head of U.S. Joint Forces Command and NATO Transformation, he is helping to define that role.
The numbers support Mattis's contention. When the United States invaded Iraq in March 2003, its robot force consisted of a handful of drones, mostly prototypes. Most soldiers, from commanders down to the lowliest privates, were skeptical of the robots’ place on the battlefield.
Today, the United States fields more than 5,300 unmanned aerial vehicles and more than 12,000 ground robots. The military has embraced them as a way to reduce risk to soldiers, gather intelligence, and strike stealthily at remote enemies.
Yet using robots in this way poses subtle risks, some of which may be hard to identify at first. Will a war that promises little sacrifice and few military casualties make force a more attractive option for policymakers? Will it alter what it means to be a “citizen-soldier”? Can reliance—or over-reliance—on robotics reduce our flexibility when confronted with enemies who adapt to the new technology? And, perhaps most chillingly, can we guarantee that terrorists won’t build their own robots the way they now hack computers?
Open Source Warfare
Today's military wisdom posits that technology alone cannot win urban conflicts and insurgencies such as those in Iraq, Afghanistan, and Pakistan. Instead, that kind of conflict must be fought with infantry, the so-called boots on the ground. Yet the growing military investment in robotic technology—by the United States as well as by more than 40 other nations—suggests that robots are rapidly becoming an important piece of tomorrow's military arsenal. Israel and South Korea, for instance, already use armed robots to patrol their borders. Robots may not replace soldiers, but as automated systems become more capable, they will certainly supplement human troops.
And robots are indeed becoming more capable, though soldiers still make most critical decisions. Operators in cubicles in the United States (and apparently Pakistan) routinely fly drone aircraft via remote control, monitoring and attacking potential targets. Military personnel credit drone attacks with disrupting Al Qaeda's leadership along the hard-to-reach Afghanistan-Pakistan border.
On the ground, soldiers deploy robots to search dangerous locations, and to find and defuse roadside bombs. New models that carry weapons have already entered service.
As impressive as this sounds, the current robot generation is the technological equivalent of the Model T and the Wright Flyer. Robots will only grow smarter, faster, more lethal—and more autonomous. Some military thinkers envision future conflicts that involve tens of thousands of armed autonomous and remote-controlled robots.
Peter Singer, the director of the 21st Century Defense Initiative and a senior fellow in foreign policy at the Brookings Institution in Washington, D.C., wrestled with the risk posed by robots in his latest book, Wired for War. Singer studies the ethically disturbing nature of modern warfare; his previous books dealt with military services companies and the use of child soldiers. His latest raises equally pressing concerns.
“Robotic warfare is open-source warfare,” Singer said. Enemies can not only probe robots for technological vulnerabilities, but they can also build robots themselves.
“It is made all the easier by the fact that so much of this technology may be revolutionary, but it is also highly commercialized. For example, with $1,000 you can go out and build a drone that has roughly the same capabilities that the Raven drones that our soldiers use in Iraq have,” Singer said.
The Raven is a small drone used as a scout by ground troops. A couple of years ago, Singer said, Wired magazine's editor, Chris Anderson, and some friends assembled a cheap drone from off-the-shelf materials that was similar to the ones deployed in Iraq just a few years earlier.
Others have done the same, often in unlikely places. During the Israeli invasion of Lebanon in 2006, Hezbollah fighters fielded at least four drones. Singer believes the combination of robotics and terrorism will empower individuals at the expense of states and make it easier to launch terrorist attacks. “You don’t have to promise 72 virgins to a robot to convince it to blow itself up,” Singer remarked in Wired last year.
Robot suicides are not the scariest terror scenarios. While researching his book, Singer came upon an Air Force study that found that drones were the optimal platform for deploying weapons of mass destruction—and one against which we have no effective defenses. He also spoke with a Pentagon scientist who claimed he could shut down Manhattan with a budget of $50,000. Although Singer refused to provide details, he called the scenario “pretty scary and very real.”
Singer also worries about how America will weigh going to war when it can put robots rather than citizens in harm's way. “Many of the people whom I interviewed worried that it was going to make us more cavalier about the use of force,” he said. “As one Reagan-era assistant defense secretary put it, ‘It is going to return us to the cruise missile diplomacy of the 1990s. We will have more Kosovos and less Iraqs.’” This tendency could be as pronounced among doves as hawks. Last winter, for example, the Washington Post argued for using unmanned vehicles to intervene in the genocide in the Darfur region of Sudan. In fact, a private military contractor offered to rent a drone to a group of college students who had raised money to do something about Darfur, Singer said.
How likely are remote interventions when there is no draft, no shared sacrifice, and less chance of American casualties? “You take the already lowering bars to war and you drive them to the ground,” Singer said.
He worries that, rather than being seen through the memories, wounds, and losses of those who fought, war will morph into entertainment. There are more than 7,000 clips of combat footage, mostly from drones, circulating on the Internet. People e-mail them to one another, or set them to music. “Soldiers call this ‘war porn,’” he said.
Policy makers assume American technology strikes fear in enemy hearts. But overseas, the perception is different. Hearing a drone buzzing overhead, a Lebanese news editor told Singer that Americans were cowards “because they send out machines to fight us. They don’t want to fight us like real men. So all we have to do is just kill a few of their soldiers to defeat them.”
In a May 2009 opinion piece in the New York Times, David Kilcullen, a former adviser to General David Petraeus, and Andrew Exum, an Army officer in Iraq and Afghanistan and now a fellow at the Center for a New American Security, argued that drone strikes are often counterproductive.
Many targets, they point out, are not positively identified and air strikes are not always precise. Those kinds of errors lead to attacks that kill more civilians than terrorists—and thus drive civilians into the arms of local militants. Moreover, coverage via the Internet as well as traditional media brings the horror of civilian deaths home to everyone in the region, further radicalizing the population.
Concern over the unintended consequences of drone strikes was one of the reasons the Obama administration appointed General Stanley McChrystal to be commander in Afghanistan, replacing General David McKiernan before McKiernan completed his tour of duty. McChrystal acted quickly to raise the bar for intelligence before launching drone attacks.
General Mattis sees a moral hazard in a robot-only presence. “From a Marine's point of view, we cannot lose our honor by failing to put our own skin on the line to protect the realm,” he said. “And the realm today isn’t just geographic; the realm is the ideas, the concepts that grew out of the Enlightenment.”
Yet robots are already changing the military. Many drone pilots operate out of trailers at a military base in Nevada. They often speak of the disconnect of working with virtual partners they may never meet to direct kills against enemy combatants that exist only on their video monitors.
Those same monitors, however, may show the carnage they inflicted, from the wounded writhing on the ground after an attack to families burning in buildings engulfed in flames.
After a shift, these same pilots drive home and sit around the dinner table helping their children with homework.
Some reports suggest that drone pilots suffer from posttraumatic stress disorder at rates greater than other soldiers. Perhaps this “disconnect” is part of the reason: It may be harder for a soldier to justify killing when his or her life is not at risk.
Some commanders worry that distancing soldiers from the results of their actions makes it easier to commit war crimes. Others, such as Georgia Tech computer scientist Ronald C. Arkin, who designs software for Army combat robots, believe robots could behave more ethically than humans on a battlefield. Because autonomous robots have no instinct for self-preservation, Arkin has argued, they are more likely to follow predefined rules of engagement rather than lash out from fear or anger.
Robot-based warfare might well change the types of skills the military looks for in future soldiers. While it takes years to train an F-15 pilot, drone pilots achieve comparable results with just months of training, and they pilot far cheaper aircraft.
In fact, one of the Army's top drone pilots was a 19-year-old high school dropout. Like many of his colleagues, he joined the service with few traditionally useful skills. He was, however, an experienced video gamer, and parlayed those skills into an assignment.
What does that say about future recruits? As one interviewee told Singer about the soldiers of tomorrow, “Having a strong bladder and a big butt may turn out to be more useful physical attributes than being able to do 100 pushups.”
It is a disruptive and unsettling picture. This is why policy makers have already begun wrestling with the threats and opportunities—both military and moral—of robotic technologies.
Militaries are conservative cultures that only transform themselves if they face a problem, General Mattis noted. The outlines of the robotics problem are already visible: What will an armed conflict with tens of thousands of UAVs and ground-based robots look like? What about a robot-delivered bomb in the hands of a terrorist?
The military tries to find new strategies to tackle disruptive changes through discussions, models, and war games. In the years prior to World War II, for instance, the U.S. Navy sought to find a way to get across the Pacific Ocean to counter-attack Japan in the event of war.
“As they lost each war game time after time, they found they could win when they added things like aircraft carriers, amphibious troops, and logistics trains,” Mattis said. Those exercises guided military strategy when war broke out in the Pacific.
Today, the United States is learning from its experiences in Iraq and Afghanistan. Strategies have evolved rapidly. Yet even though the United States leads the world in military robots, it cannot rest on its accomplishments.
“In technology, there is no long-term first-mover advantage. If you don’t believe me, how many in this room still use Wang computers?” Singer asked an audience at Brookings last year, referring to the manufacturer of the first word processor. “The British and the French invented the tank in World War I, but the Germans learned how to use it right.”
The development of robot-based warfare has been, up to now, a great asset to the American military. But there is a profound risk inherent in this technology should some other force discover a better way to employ robot warriors.