Evan Schuman: Killer robots? What could go wrong? Oh, yeah ...

The UN wants to talk about killer robots as 'conventional weapons.' Someone needs to learn the IT facts of life: If something can go wrong, it will.

Has anyone at the United Nations ever used Windows? This question occurred to me after reading an article with the headline "'Killer robots' to be debated at UN."

This was the kind of news item that makes you double-check the URL to make sure it's not from a satirical site like The Onion. But no; it was from a real news organization, the BBC. According to the report, "Killer robots will be debated during an informal meeting of experts at the United Nations in Geneva" later this year.

The report defines a killer robot as "a fully autonomous weapon that can select and engage targets without any human intervention." The BBC adds that "the meeting will be held during the UN Convention on Certain Conventional Weapons." It's probably best not to ask when the UN decided that killer robots are now "conventional weapons."

I find this frightening -- and Windows has a lot to do with that. The real reason is that I'm familiar with the IT facts of life. Websites run flawlessly for months and months and then, without anyone touching a line of code, they suddenly glitch. Windows desktops, years after blue screens of death were supposed to be extinct, freeze without any apparent cause. Mobile apps are developed and rolled out with egregious security holes.

And now we're talking about autonomous killing machines? Talk about your killer mobile apps! These could come with missiles and automatic weapons. The idea is that these devices would sharply reduce casualties, at least on the side that launched them. The problem is doing that while making the fatal IT assumption that computer programs never fail, fail-safes are always reliable, and nothing will be overlooked. Sound like any IT operation you've ever seen on Earth?

Here are some more annoying bits of reality: cyberthieves, hostile governments and terrorists. Bad guys have been known to break into sites and take them over (scripts are far easier to trick than even government workers), gaining the freedom to access -- or destroy -- unlimited data.

The technology hurdle that has to be cleared before we, as a society, even consider killer robots is this: strict quintuple-verified authentication. The robot's program must have a foolproof means -- with as many layers of backups and fail-safes as possible -- of verifying any orders it is given. Autonomy will have to be limited at least to the extent that the robots will respond to self-destruct or shutdown commands. But if the robots are truly autonomous -- well, just stream I, Robot or The Matrix if you're unfamiliar with the sorts of nightmare scenarios sci-fi has already shown us. And when it comes to killer robots, I think a nightmare is inevitable.
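To make the idea of layered order verification concrete, here is a toy sketch -- emphatically not a real weapons protocol. The authority names, the keys, and the all-signatures rule are all invented for illustration; the only point it demonstrates is the fail-closed principle that a command is refused unless every independent layer checks out, including the shutdown command.

```python
import hmac
import hashlib

# Hypothetical illustration: a platform honors a command only when every
# one of several independent command authorities has signed it. Keys and
# authority names are made up; nothing here reflects any real system.
AUTHORITY_KEYS = {
    "field_commander": b"key-alpha",
    "legal_review":    b"key-bravo",
    "safety_officer":  b"key-charlie",
}

def sign(command: str, key: bytes) -> str:
    """One authority's HMAC-SHA256 signature over the command text."""
    return hmac.new(key, command.encode(), hashlib.sha256).hexdigest()

def verify_command(command: str, signatures: dict) -> bool:
    """Fail closed: require a valid signature from *every* authority."""
    return all(
        name in signatures
        and hmac.compare_digest(signatures[name], sign(command, key))
        for name, key in AUTHORITY_KEYS.items()
    )

# A shutdown order signed by all three authorities is honored...
order = "SHUTDOWN"
sigs = {name: sign(order, key) for name, key in AUTHORITY_KEYS.items()}
print(verify_command(order, sigs))   # True

# ...but an order missing even one signature is refused.
del sigs["safety_officer"]
print(verify_command(order, sigs))   # False
```

The design choice worth noticing is the default: when anything is missing or wrong, the answer is "no." That's the opposite of how most IT systems behave in practice, which is exactly the column's worry.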

Of course, even an autonomous robot gets its commands from somewhere. It can only operate because it has commands known as programming. But if the robot's operating system goes all Windows on us ("Windows" being an easily understood euphemism for "FUBAR"), can the programmer who inadvertently (we hope) keyed in the error-ridden code be charged with murder or a war crime? Can that programmer be charged with disobeying an order, in that his or her code did not do what the programmer was told to make it do? Can that programmer's supervisor -- who was supposed to check all code -- be so charged, too?

There is some comfort in the fact that the UN truly is approaching this entire idea as a debate. It has invited two robotics experts, one of whom is extremely skeptical about autonomous killer robots (in fact, he's a co-founder of the Campaign to Stop Killer Robots and chairman of the International Committee for Robot Arms Control -- and I am not making any of this up), while the other is more receptive to the idea but doesn't seem to be a backer of Dr. Strangelove intensity. Personally, my position is more extreme than that held by either of these two gentlemen. As someone who makes a living writing about IT efforts that deliver unexpected and unintended results, I find the idea of a truly autonomous killer robot just plain terrifying.

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek and eWeek. Evan can be reached at eschuman@thecontentfirm.com and he can be followed at twitter.com/eschuman. Look for his column every other Tuesday.

To express your thoughts on Computerworld content, visit Computerworld's Facebook page, LinkedIn page and Twitter stream.