Having defeated Atari’s pixelated space monsters, Google’s artificial intelligence team, DeepMind, has moved on to a bigger target: fighting climate change by reducing energy waste.
Two years ago, Google's DeepMind team began using its machine learning algorithms—programs that learn and adapt beyond their initial instructions—to improve the efficiency of the cooling systems at Google's data centers. The company announced this week that the technology had reduced overhead energy use at those data centers by a staggering 15%, resulting in huge savings for Google and the environment.
In 2014, Google purchased DeepMind, a British artificial intelligence company, for roughly $500 million. Since then, the DeepMind team has demonstrated serious game, producing an artificial intelligence agent that learned to play classic Atari video games and a computer program, AlphaGo, that beat the world champion at the board game Go.
Using similar ideas from machine learning, the DeepMind team has set its algorithms a more ambitious task: optimizing the cooling systems in Google's data centers.
Computers produce heat, and the bigger the computer, the more heat it produces. Your laptop might have a small fan on the side, but Google’s data centers—which are filled with massive servers—have elaborate industrial cooling systems. Optimizing the huge number of components involved in cooling the servers—fans, windows, etc.—is tricky. And accounting for fluctuating variables, like weather and server usage, is even more difficult.
But DeepMind’s computers handle this type of complexity much better than a human can.
DeepMind announced on its blog:
We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data centre—data such as temperatures, power, pump speeds, setpoints, etc.—and using it to train an ensemble of deep neural networks. Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour.
In other words, the algorithm learned which settings minimize the PUE, which measures how much energy the data center uses for auxiliary systems, including cooling towers. The result was a 40% reduction in the energy used for cooling, which translates to a 15% reduction in total overhead energy use at the data centers.
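To see how a 40% cooling reduction can correspond to a 15% overhead reduction, here is a toy Python sketch of the PUE arithmetic. All the energy figures are invented for illustration; they are not Google's actual numbers, and they simply assume cooling makes up part of the non-IT overhead.

```python
# Hypothetical illustration of the PUE arithmetic described above.
# All numbers are invented for demonstration; they are not Google's figures.

def pue(total_energy, it_energy):
    """Power Usage Effectiveness: total building energy / IT energy."""
    return total_energy / it_energy

it_energy = 100.0        # energy consumed by the servers themselves (arbitrary units)
cooling_energy = 4.5     # assumed portion of overhead spent on cooling
other_overhead = 7.5     # assumed lighting, power conversion, etc.

overhead_before = cooling_energy + other_overhead          # 12.0
pue_before = pue(it_energy + overhead_before, it_energy)   # 1.12

cooling_after = cooling_energy * (1 - 0.40)                # 40% cut to cooling energy
overhead_after = cooling_after + other_overhead            # 10.2
pue_after = pue(it_energy + overhead_after, it_energy)     # 1.102

overhead_reduction = (overhead_before - overhead_after) / overhead_before
print(f"Overhead energy reduction: {overhead_reduction:.0%}")  # prints 15%
```

With these made-up figures, trimming cooling energy by 40% shaves the overall overhead by exactly 15% and nudges the PUE from 1.12 toward 1.10, mirroring the proportions the article describes.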
Industrial systems—like data centers—have a huge carbon footprint. Google is hopeful that DeepMind’s machine learning technology can be applied to make other energy systems more efficient and reduce their impact on the environment.