When Robots Attack: A Look At 21st Century Warriors

P.W. Singer, author of the new book Wired For War, is concerned about how battlefield robots are changing IT perspectives.

Mitch Wagner, California Bureau Chief, Light Reading

February 13, 2009

8 Min Read

P.W. Singer says he hears the Terminator references a lot.

Singer, author of a book about robots being used in wars in real life, on battlefields in Iraq and Afghanistan, says he's often asked whether the robots will rise up to exterminate us.

His response: Maybe. At some point in the future. But between now and then, there's a whole forest of moral, legal, and political issues we'll need to navigate, and many of them are problems not in some distant future, but in the here and now.

Singer tackles those problems in his new book, Wired For War: The Robotics Revolution And Conflict In The 21st Century. Singer, a senior fellow and director of the 21st Century Defense Initiative at the Brookings Institution, previously wrote Corporate Warriors: The Rise Of The Privatized Military Industry, which looked at private companies providing military services for hire. That book was published in 2003, before the use of those companies became an issue in Iraq. He followed it with Children At War, about child soldiers. Singer served as coordinator of the Obama 2008 campaign's defense policy task force.

Robots are proliferating on the battlefields of Iraq and Afghanistan, Singer said in a phone interview with information.

"I think the extent of it would catch people by surprise," he said.

Unmanned aerial drones went from just a handful when he turned in his manuscript in mid-2008 to 5,300 six months later, and to 7,000 today. On the ground, the military now has 12,000 robots deployed tactically.

The robots are currently in use primarily for reconnaissance and defensive purposes -- for example, to disable explosive devices or to fly above a battlefield and relay information to ground-based observers. However, they're also starting to be used to fight back against enemies, and scientists are working on robots capable of delivering greater and greater lethal force.

"These are just the first generation, the Model T Fords," he said. "The sort of things we only talked about in science fiction conventions really need to be talked about in the real world."

Society's leaders -- including the voting general public -- need to stay on top of these issues so that they can make informed policy decisions, Singer said. Even scientists working on robotics are often unaware of the uses to which their inventions are being put. Singer spoke at the TED conference this month alongside a robotics scientist who talked about how wonderful robots are and their potential to develop into a new species. The scientist singled out BigDog, a robot capable of walking on four legs over rough terrain, carrying cargo like a pack mule.

"But who's paying for BigDog?" Singer said. "It's not being built for the betterment of humanity; it's being built for the betterment of humanity's ability to kill."

Robots will change the definition of what it means to go to war. "My grandfather went to war in World War II in the Pacific, and that meant he went far away, to a dangerous place, and his family didn't know where he was and if he was coming back," Singer said. By contrast, Predator drone pilots work in trailers in Nevada and commute to work in Toyotas. They deal death for a 12-hour shift on a battlefield in Iraq, then get in the car, drive home, and 30 minutes later they're having dinner with their families.

Robot operators have higher levels of post-traumatic stress disorder than many units in Iraq have, Singer said. That's counter-intuitive -- you might expect that a soldier who's been ripped from his family, taken thousands of miles away, and put in a dangerous situation, would have more stress than a soldier who gets to stay home with his family and is never in physical danger himself. However, the drone operators suffer from a disconnect. Like bomber pilots, they deal death at a distance, but unlike bomber pilots they stay around and witness the consequences. They see the enemy dying at close range -- they sometimes see fellow Americans dying and are unable to do anything about it. And, on top of that, they have to deal with the stresses of home and family life.

"When you're deployed, everyone is focused on the mission, your wife doesn't get mad at 7 p.m. because you were late for Timmy's soccer practice," Singer said.

To help reduce stress and improve effectiveness, some squadron commanders are requiring operators to take measures to separate their combat experience from normal life. The operators wear their flight suits into the facility, they cut off personal communications, and, on longer missions, the squadron stays together around the clock, like a Super Bowl team staying in a hotel before the game even if they're playing the game at home.

The upside of fighting war with robots is that it can reduce the cost of war, both in lives and societal disruption at home. But that's also its downside. Even conservatives, including a former senior adviser to President Reagan, worry that the cost of war will become too cheap, making wars too easy to start, Singer said.

The relationship between the public and the military is changing; war is becoming costless to civilians, with no draft, no declarations of war, and nobody even buying war bonds, Singer said. War still has an enormous impact on war-fighters and their families and friends, but as human war-fighters are replaced by robots, that cost will diminish further.

The connection between society at large and war is complicated by video. Robots record video of their actions, and thousands of hours of that video find their way to YouTube. That has the potential to be a benefit, because it creates an information channel between the military on the battlefield and the civilian population at home that's unmediated by the news media. On the other hand, it could widen the disconnect, as people at home view the videos as entertainment rather than as images of real people suffering and dying, Singer said.

Singer described one e-mail with the subject line "Watch this" and an attached video of a Predator drone strike -- an explosion and bodies flying in the air -- set to the music of the Sugar Ray song "I Just Want To Fly." "People are turning war into a joke," he said.

But is abuse of combat robots inevitable?

"We can't resist ourselves. We always open Pandora's box; that's the nature of science and technology itself," Singer said. Fortunately, the human race has proven itself able to restrict harmful military technology, like poison gas, chemical and biological warfare, and nuclear weapons. "The darker side is we usually don't get our butts into action until the bad thing happens. We don't have international law until we've had the Thirty Years War, we don't get the Geneva Conventions until we have the Holocaust, we don't get to international land mine conventions until we've buried 25 million land mines under the Earth."

And even when the overwhelming majority of the world refrains from using terrible weapons, the lunatic fringe remains a threat.

So what about the question we started this article with? Will robots develop strong AI -- artificial intelligence at the human level or better -- and rise up to kill or enslave us all?

"The people that I spoke with in the field did take very seriously the idea that we would one day achieve 'strong AI' and that it would represent a break point in history, a 'singularity,' akin to the printing press or atomic bomb, where the old rules have to be re-evaluated and new questions about what is possible and proper have to asked," Singer said in an e-mail.

The people taking the possibility of strong AI seriously weren't just visionaries like Raymond Kurzweil.

"I even had a special operations officer just back from hunting terrorists in Iraq talk about this in an interview," Singer said.

But we can't predict what intelligent robots' relationship to the human race might be. They might be a threat, like the Terminator. But not necessarily. Some say robots would require a survival instinct to threaten human beings, and researchers are intentionally leaving the survival instinct out of their machines. Others say that artificial intelligences might be more moral than human beings -- more like Mr. Data from Star Trek than like the Terminator, Singer said.

"Others joke that just about the time the AI is ready to take over, their Microsoft Vista will crash," Singer said.

He added, "My sense is that there are certainly questions of ethics and control, but they are already happening now, well before we get to strong AI. That is, there are enough questions that arise with the predators and packbots of today, that we don't need to jump to the future to know that something both interesting and a bit scary is going on."

About the Author

Mitch Wagner

California Bureau Chief, Light Reading

Mitch Wagner is California bureau chief for Light Reading.
