[Congressional Record (Bound Edition), Volume 157 (2011), Part 10]
[House]
[Pages 14161-14162]
[From the U.S. Government Publishing Office, www.gpo.gov]




                    OPPOSING AUTOMATED KILLER DRONES

  The SPEAKER pro tempore. The Chair recognizes the gentlewoman from 
California (Ms. Woolsey) for 5 minutes.
  Ms. WOOLSEY. Madam Speaker, there was an article in The Washington 
Post earlier this week that we should all find very unsettling and 
disturbing.
  We know that in recent years the Pentagon has increasingly used 
unmanned drone aircraft to carry out violent acts of war. And frankly, 
that's bad enough. But now there's a new and even more frightening 
technology in the works. It's called ``lethal autonomy.'' And under this 
system, the drones would no longer be remotely operated and controlled
by actual human beings. The lethal autonomy drones would be computer 
programmed to carry out their deadly mission independently. No human 
hand providing steering and guidance.
  I can't even begin to wrap my head around the humanitarian red flags 
associated with this experiment in robotics.
  Software can break down. It could even be hacked. Furthermore, 
computers don't have a conscience. They aren't nimble; they can't make 
snap decisions based on new information or ethical considerations. 
They're programmed to do what they do without judgment, discretion, or 
scruples. You can just imagine, or I can anyway, mass civilian 
atrocities thanks to a robot drone raging out of control.
  Thankfully, a group called the International Committee for Robot Arms 
Control is speaking up and making these points, asking: if we have a 
treaty banning land mines, why not one that outlaws these automated 
killer drones?
  According to the Post, the military has begun to grapple with the 
implications of this technology. Well, I can only suggest that they 
continue grappling, and find the flaws and the possible harmful and 
unpredictable consequences, before using these technologies.
  One advocate of these new drones believes it's possible to program 
them to comply with international law regarding the conduct of 
hostilities. Well, I'm certainly skeptical. We couldn't even get the 
last President of the United States to understand and abide by the 
Geneva Conventions. I don't know how we're going to get a robot to do 
it.
  Madam Speaker, the increasing dehumanization of warfare is part of a 
terrifying trend. Somehow it's easier to kill one another when we have 
computers and machines to carry it out for us, when we don't have to 
stare our own mayhem in the face.
  As a member of the Science Committee, I'm totally enthusiastic about 
American high-tech innovation. But I believe we should be using our 
knowledge and ingenuity to give the civilian economy the boost it needs 
to create good jobs for hardworking middle class Americans and to 
create a smarter response to world conflict. All of this money we're 
funneling to defense contractors to devise evermore sophisticated ways 
to kill one another must be reinvested in alternatives to warfare and 
nonviolent ways of resolving conflict.
  That's what my Smart Security plan does. I've discussed this many, 
many times from this very spot. It's called Smart Security. It defines 
military force as the very, very last resort. And it directs energy and 
resources toward diplomacy, democracy promotion, development, and 
peaceful ways of engaging with the rest of the world.
  Madam Speaker, in two weeks' time we will have been at war for a full 
decade. More than 6,000 Americans have died, and 10,000 innocent 
Afghans and Iraqis have been killed for the cause of their so-called 
liberation. Many, many more of our own troops have been harmed
and will always be living with the results of their injuries.
  The time is now. It is time to stop building machines that can kill 
more efficiently and to start bringing our troops home.

                          ____________________