Somewhere, Hollywood director James Cameron is smugly saying “I told you so.”
The man who 29 years ago brought us the sci-fi hit The Terminator, about a future in which killer robots take over the world after computers are programmed to think, might as well be the godfather of a new movement. That movement is pressuring countries around the world to ban the creation and use of machines that could select and fire on targets without human intervention.
His fanboys in this campaign include academics, Nobel Prize winners and Human Rights Watch. (And maybe unemployed “actor” Arnold Schwarzenegger, the star of The Terminator series, whose famous catchphrase “I’ll be back” obviously doesn’t refer to the current state of his film career.)
As in Cameron’s script, they’re worried that the future of warfare could be placed in the hands (circuits?) of fully autonomous robots. Though we don’t yet have that capability (and here we pause to allow all The Terminator acolytes to scream: “Remember Skynet! Remember Skynet!”), they warn that we will within the next three decades.
Legendary science-fiction writer Isaac Asimov foresaw many of the potential problems with robot-human interactions and penned the Three Laws of Robotics in order to ensure human lives would never be compromised by a machine’s actions. Trouble is, Asimov’s laws would render military robots useless.
Maybe that’s not such a bad thing. For the many of us who struggle to figure out how to even get our cell phone alarm to stop going off, it’s not a stretch to envision computers doing other dastardly deeds. Just ask anyone who has faced the blue screen of death minutes before a project deadline.
Even little old Canada, the nicest country on earth, is exploring the use of technology that could lead to more efficient killing (hey, it’s all about productivity in today’s world). Just this week we sent a dishwasher into space. Well, OK, it was a military satellite, but it was the size of a dishwasher. The other, more compact satellites are still snickering.
It’s not the machines that should give us the futuristic heebie-jeebies, though. It’s the people who program them. Computers today are truly a “chip off the old block,” doing whatever we tell them to do (which is why most parents would trade their teenagers for one in a nanosecond).
That can be very good for humanity (the Internet connects the world), or very bad (the Internet connects the crazies of the world). So, just as in The Terminator, the biggest concern should be controlling the people who are behind the technologies of tomorrow.
It’s one thing to recruit video game players to steer the armed drones now common on the battlefield; at least they have a conscience, complete with its emotional complexities, to guide their decisions on pressing the trigger.
But it’s another thing completely to write programs that allow a machine – working purely on algorithms – to decide when to open fire on people.
Robots may save lives by diminishing the need for soldiers in war, but there need to be safeguards controlling their use.
Human Rights Watch is only partially right: we need not a complete ban, but rather strict protocols specifying how humans would control any military machine.
Anything less will simply prove Cameron right. And his ego is already big enough.
Image credit: Carolco Pictures