Popular physicist Stephen Hawking made headlines again this week by suggesting humanity should unite under one world government. The reason the eminent scientist cites for advocating what many would consider an extreme and very unpopular position: Skynet is coming. Skynet, of course, was the sentient AI that began eradicating mankind in the Terminator film franchise, and Hawking is terrified that this possibility might become a reality for us. In 2015, Hawking, along with other great minds in science and technology, signed an open letter warning the world of the inherent danger that the exponentially accelerating capability of artificial intelligence may pose for the future of humanity, and demanding restrictions, including mandatory fail-safe devices and programming in all such machine intelligences. How do we ensure that all tech producers comply with this demand? Hawking suggests one world government.
“But that might become a tyranny,” Hawking said. “All this may sound a bit doom-laden but I am an optimist. I think the human race will rise to meet these challenges.”
The inherent danger in centralizing power across the world, as the scientist points out, is that a tyrannical regime could rise to power (think “The Hunger Games” or “1984”), which could itself usher in a dark age of humanity; but Hawking thinks it’s worth the risk. Unrestricted AI would become so powerful, so quickly, he says, that we would be like ants in comparison.
“The real risk with AI isn’t malice but competence. A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble. You’re probably not an evil ant-hater who steps on ants out of malice, but if you’re in charge of a hydroelectric green energy project and there’s an anthill in the region to be flooded, too bad for the ants. Let’s not place humanity in the position of those ants.” – Stephen Hawking
What do you think? Will AI capability spiral out of control and end up eliminating humanity at some future date? Will we be able to control it, or perhaps even harness its capability to enhance our own? Let us know on Facebook or in the comments below.