Google Machine Masters Video Games

Google has developed a system that behaves much like a teenager: an artificial-intelligence system that spends all of its time playing and mastering video games, according to bloomberg.com. Google presented the machine on Wednesday, 25 February 2015, describing the machine-learning technology as the first significant rung on the ladder toward building intelligent AI that can figure out how to do things on its own.

The Google machine, which is inspired by the human brain, learned how to play 49 classic Atari games. Among the games it played were Pong, Space Invaders, Boxing, Tennis and the 3D racing challenge Enduro. Overall, it performed as well as or better than a professional human player. Researchers from Google DeepMind stated that this is the first time a system has learned how to master a variety of complicated tasks.
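DeepMind's published approach combines reinforcement learning with a deep neural network that learns directly from raw screen pixels; the article does not describe the algorithm itself. The sketch below is only a minimal illustration of the general reinforcement-learning idea behind such systems, using a tabular Q-learning update on a made-up toy "corridor" environment; the environment, rewards and hyperparameters are invented for illustration and are not DeepMind's actual implementation.

```python
# Minimal tabular Q-learning sketch (illustrative only; DeepMind's system
# uses a deep neural network over raw screen pixels, not a lookup table).
import random
from collections import defaultdict

# Toy environment: a 1-D corridor of 5 cells; reaching the rightmost cell
# gives reward 1, every other step gives reward 0. All values here are
# invented purely for illustration.
N_STATES = 5
ACTIONS = [-1, +1]          # move left or right
GAMMA = 0.9                 # discount factor (assumed value)
ALPHA = 0.1                 # learning rate (assumed value)
EPSILON = 0.1               # exploration rate (assumed value)

q = defaultdict(float)      # Q-values, keyed by (state, action)

def step(state, action):
    """Apply an action and return (next_state, reward, done)."""
    next_state = min(max(state + action, 0), N_STATES - 1)
    done = next_state == N_STATES - 1
    return next_state, (1.0 if done else 0.0), done

def choose_action(state):
    """Epsilon-greedy: mostly exploit the best known action, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(state, a)])

for episode in range(500):
    state = 0
    done = False
    while not done:
        action = choose_action(state)
        next_state, reward, done = step(state, action)
        # Q-learning update: nudge the estimate toward the observed reward
        # plus the discounted value of the best action in the next state.
        best_next = max(q[(next_state, a)] for a in ACTIONS)
        q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])
        state = next_state

# Print the greedy action learned for each state.
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES)})
```

The key point the toy example shares with the system described in the article is that nothing about the task is pre-programmed: the agent discovers which actions pay off purely from the rewards it receives while playing.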

The research behind the Google machine comes from DeepMind, a London start-up that Google acquired last year for a reported £400m, according to bbc.com. This is not the first machine to master a complicated game: in 1997, Deep Blue, a chess-playing computer from IBM, famously defeated world champion Garry Kasparov. Deep Blue, however, was pre-programmed with the chess knowledge it needed to excel at the game, whereas the DeepMind system learns on its own.

Dr. Demis Hassabis, vice president of engineering at DeepMind, said that up until now self-learning technologies had only been used for simple problems. This is the first time such a system has been used in a perceptually rich environment to accomplish tasks that are difficult even for humans.
