13 September 2022

DeepMind AI Learns Soccer

Artificial intelligence has learned to play soccer. By learning from decades’ worth of computer simulations, an AI took digital humanoids from flailing tots to proficient players. Researchers at DeepMind taught the AI to play soccer in a computer simulation through an athletic curriculum resembling a sped-up version of a human baby growing into a soccer player. The AI was given control over digital humanoids with realistic body masses and joint movements. The first phase of the curriculum trained the digital humanoids to run naturally by imitating motion-capture video clips of humans playing soccer. A second phase involved practising dribbling and shooting the ball through a form of trial-and-error machine learning, known as reinforcement learning, that rewarded the AI for staying close to the ball.
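The "rewarded for staying close to the ball" idea is an instance of reward shaping in reinforcement learning. Below is a minimal, hypothetical sketch of such a shaped reward; the function name, the exponential form, and the scale factor are illustrative assumptions, not DeepMind's actual reward function.

```python
import math

def proximity_reward(player_xy, ball_xy, scale=0.1):
    """Shaped reward that grows as the player gets closer to the ball.

    Illustrative assumption: an exponentially decaying reward that is
    1.0 when the player is on the ball and approaches 0 far away.
    """
    dx = player_xy[0] - ball_xy[0]
    dy = player_xy[1] - ball_xy[1]
    distance = math.hypot(dx, dy)  # Euclidean distance to the ball
    return math.exp(-scale * distance)

# A player standing on the ball earns the maximum reward...
assert proximity_reward((0.0, 0.0), (0.0, 0.0)) == 1.0
# ...and the reward shrinks smoothly as the player drifts away,
# giving the learner a gradient toward ball control.
assert proximity_reward((10.0, 0.0), (0.0, 0.0)) < proximity_reward((5.0, 0.0), (0.0, 0.0))
```

A dense signal like this gives the trial-and-error learner useful feedback on every step, long before it is skilled enough to earn the sparse reward of actually scoring.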

The first two phases represented about 1.5 years of simulated training time, which the AI sped through in about 24 hours of real time. But more complex behaviours beyond movement and ball control began emerging after five simulated years of soccer matches. The third phase of training challenged the digital humanoids to score goals in two-on-two matches. Teamwork skills, such as anticipating where to receive a pass, emerged over the course of about 20 to 30 simulated years of matches, or the equivalent of two to three weeks in the real world. This produced measurable improvements in the digital humanoids’ off-ball scoring opportunity ratings, a real-world metric of how often a player gets into a favourable position on the pitch. Such simulations won’t immediately lead to flashy soccer-playing robots.
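The time-compression figures above are roughly consistent with each other, as a back-of-the-envelope check shows. All numbers below come from the article; the midpoints chosen for the ranges (25 simulated years, 2.5 weeks) are assumptions for the sake of the arithmetic.

```python
HOURS_PER_YEAR = 365 * 24

# Phases 1-2: 1.5 simulated years compressed into about 24 real hours.
speedup_early = (1.5 * HOURS_PER_YEAR) / 24

# Phase 3: roughly 20-30 simulated years of matches in about 2-3 real
# weeks; take the midpoints of both ranges (an assumption).
speedup_late = (25 * HOURS_PER_YEAR) / (2.5 * 7 * 24)

# Both phases imply a simulation running roughly 500-550x real time.
assert round(speedup_early) == 548
assert 500 < speedup_late < 550
```

In other words, both reported figures imply the same order of simulation speed-up, around five hundred times faster than real time.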

More information:

https://www.newscientist.com/article/2336132-deepmind-ai-learns-to-play-soccer-using-decades-of-match-simulations/