Jan 16, 2015 12:59 PM EST
Elon Musk AI Investment: Billionaire Spends $10 Million To Prevent ‘Terminator’ Scenario [VIDEO]

After physicist Stephen Hawking and Tesla Motors billionaire Elon Musk repeatedly warned of a scenario where artificial intelligence becomes too much for humanity to handle, the SpaceX entrepreneur has now made it public: Elon Musk's AI investment reaches eight figures as he tries to prevent a Skynet-style future.

After years of arguing that unsafe development of artificial intelligence could lead to science fiction-style dystopian scenarios, Musk has now put his money where his mouth is in an attempt to make the future development of artificial intelligence safe for humanity.

According to Forbes, the Elon Musk AI investment comes in the form of an impressive $10 million donation to the Future of Life Institute, which is meant to fund a sweeping research program on how to keep artificial intelligence safe for humans and avoid anything as terrible as a robot revolution against a powerless humanity (a plotline familiar from a great number of science fiction films, including this year's superhero movie "Avengers: Age of Ultron").

As Engadget reports, the Elon Musk AI investment was made public when the billionaire entrepreneur and inventor released a statement referring to a petition to keep AI safe for humankind. He said he would do his part by donating $10 million toward research into how machines could be kept from becoming "smarter" than human intelligence, preventing a state of self-awareness that could lead them to revolt against their creators, the ultimate science fiction nightmare.

Entertainment.ie, for its part, states that Elon Musk's AI investment isn't only about preventing so-called "killer robots," but also about strengthening mechanisms such as cybersecurity to keep the technology from spinning out of control.
