Musk and Hawking demand that we outlaw killer robots

Both have warned about the risks of Artificial Intelligence, sign letter to keep it out of warfare

Financial trends and news by Steven Loeb
July 27, 2015
Short URL: http://vator.tv/n/3f15

Did anyone see the new Terminator movie that came out this summer? I did, and it wasn't very good. I mean, Schwarzenegger was as reliable as always, but the plot was ridiculous. It was hard to follow and it didn't make much sense. In fact, the last three movies in this series have been pretty bad, meaning there hasn't been a worthwhile one made in 24 years.

So why on earth do they keep making them? If anything, they're a great reminder of what could happen if we made robots super smart.

Big names, and people much smarter than you and me, like Stephen Hawking, Elon Musk and Bill Gates, have all expressed their fear of AI and the potential harm it might do. And now two of those three have put their names on an open letter, also signed by AI and robotics researchers, explicitly calling for the technology to never be used in warfare.

"Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions," it says.

"Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."

The writers of the letter do acknowledge the one good thing about using AI in war: it means fewer soldiers giving up their lives. While that is something I think we can all get behind, it also means that starting a war becomes that much easier a decision to make.

The other argument that could be made is that AI could just wind up like the nuclear arms race of the Cold War: sure, both sides stockpiled their weapons, but the idea of mutually assured destruction stopped them from ever being used.

AI and nuclear weapons, though, have some key differences.

"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc," it says in the letter.

"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people."

That Hawking and Musk would sign this letter is not surprising. In an interview with the BBC, Hawking once said that "the development of full artificial intelligence could spell the end of the human race."

"Once humans develop artificial intelligence it would take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded," said Hawking.

Musk, meanwhile, has been at the forefront of the anti-AI movement for a while now, calling it the "biggest existential threat" to humanity.

"“I’m increasingly inclined to think that there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don’t do something very foolish.”

Other signers of the letter include Apple co-founder Steve Wozniak; Skype co-founder Jaan Tallinn; MIT professor Noam Chomsky; Demis Hassabis, CEO of Google DeepMind; and Peter Norvig, Research Director at Google.

Personally, I don't like artificial intelligence. Sure, there's plenty of good that can be done with it, but even the idea of a robot that can try to read your emotions creeps me out a little. Maybe I've just seen too many Terminator movies to ever fully trust it.

(Image source: medium.com)