After Google announced its partnership with Blizzard and its intention to test the DeepMind engine in a real-time environment, everyone has been asking the same question: can DeepMind beat StarCraft II?
In an attempt to extend the capabilities of DeepMind, the wunderkind of Artificial Intelligence, Google has recently announced a most unusual experiment involving one of the most celebrated real-time strategy games in history.
After beating top human opponents at turn-based board games such as Go, DeepMind’s architects have decided that the deep-learning engine is ready to take on a new challenge. And so, on the 4th of November, at the Anaheim convention in California, Google DeepMind and Blizzard announced their partnership.
What does this mean for the game experience and gameplay? How about the chance to wrestle with an opponent that has all the cunning and determination of a human player, yet can compute winning strategies as fast as a machine?
Over the following months, the DeepMind engine will face some of the toughest human opponents out there. Of course, DeepMind’s engineers predict that in the first stages of the project, the deep-learning AI will play like a small kid taking on human opponents for the first time.
During these encounters, DeepMind will need to learn the basics of operating in an ever-changing real-time environment: managing resources, building defenses, creating offensive units, taking advantage of each unit’s unique abilities, making critical decisions mid-battle, and formulating a strategy that can beat an opponent.
Think of this project as your own personal payback for all those times when the computer caught you off-guard while you had only a few units built.
In light of this, the question again pops into the back of our heads: can DeepMind beat StarCraft II? Probably not during the first phases of the project. We may not be able to compute possibilities as fast as software built on neural networks, but we can surely teach it a thing or two about how SCVs can be used for things besides constructing bases.
The team overseeing the project has declared that DeepMind is a diligent pupil: it has already started playing the game on its own, developing working strategies on the go. But will this be enough to beat a human player in tournaments such as the one hosted each year in South Korea?
If everything goes according to plan, computer scientists might be able to use this information to piece together a new type of AI, one that can simulate a human opponent’s way of thinking and acting.
Also, in the long run, such engines could assist researchers by evaluating real-time parameters and making quick, accurate decisions in the field. Can DeepMind beat StarCraft II? Maybe, but our advice to it would be to construct additional pylons.
Image source: YouTube