Not content with owning some of the most efficient data centres on the planet, Google are now using machine learning to gain a deeper understanding of their server farms. At the Data Centres Europe conference, Google’s Joe Kava detailed how the company is using neural networks to sift through the immense amounts of information harvested from its data centres and make recommendations for improving efficiency. In short, Google have created an artificial intelligence that knows more about Google’s server farms than the humans who run them.

“In a dynamic environment like a data center, it can be difficult for humans to see how all of the variables interact with each other,” Kava stated. “We’ve been at this (data center optimization) for a long time. All of the obvious best practices have already been implemented, and you really have to look beyond that.”

The system was designed by Jim Gao, nicknamed the ‘boy genius’ by his colleagues for his impressive analytical skills. He built a machine learning algorithm and fed it 19 variables that affect efficiency, such as IT load, weather conditions and the operation of the cooling towers, water pumps and heat exchangers. The algorithm could then analyse Google’s hundreds of millions of data points, work out the complex patterns and relationships between the variables, and recommend adjustments to use power most effectively.
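Gao’s actual model isn’t public, but the idea described above can be sketched as a small feed-forward neural network that learns an efficiency score from operational variables. Everything below is illustrative: the dataset is synthetic, and the target formula, layer sizes and learning rate are assumptions, not Google’s.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in dataset: 500 samples of 19 operational variables
# (the real inputs included IT load, weather and cooling-tower settings;
# these numbers are made up for illustration).
X = rng.uniform(0.0, 1.0, size=(500, 19))
# Hypothetical PUE-like target: a baseline of 1.1 plus a load term and
# an interaction term, with a little sensor noise.
y = 1.1 + 0.3 * X[:, 0] + 0.2 * X[:, 1] * X[:, 2] + rng.normal(0.0, 0.01, 500)

# One tanh hidden layer, trained by full-batch gradient descent on
# mean-squared error.
W1 = rng.normal(0.0, 0.1, size=(19, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, size=8)
b2 = 1.1  # start the output bias near typical PUE values

lr = 0.1
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)      # hidden activations, shape (500, 8)
    pred = h @ W2 + b2            # predicted efficiency, shape (500,)
    err = pred - y
    # Backpropagated gradients of the mean-squared error.
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2
    b2 -= lr * gb2
    W1 -= lr * gW1
    b1 -= lr * gb1

rmse = np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training RMSE: {rmse:.4f}")
```

Once trained, a network like this can be queried with hypothetical settings (a different cooling set-point, say) to see the predicted effect on efficiency before touching the real plant, which is the kind of recommendation the article describes.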

The model can now predict Google’s Power Usage Effectiveness (PUE) with 99.96 percent accuracy. Although the tweaks suggested by the system may appear small, rolled out across Google’s tens of thousands of servers the savings could be huge.
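The article doesn’t define the metric, but Power Usage Effectiveness is an industry-standard ratio: total facility energy divided by the energy consumed by the IT equipment alone. A perfect score is 1.0; everything above it is overhead such as cooling and power distribution. The figures in the example are invented.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT energy."""
    return total_facility_kwh / it_equipment_kwh

# e.g. a facility drawing 1200 kWh to deliver 1000 kWh to its servers:
print(pue(1200.0, 1000.0))  # → 1.2
```

This is why even small tweaks matter: shaving a few hundredths off the PUE of a facility drawing megawatts translates into substantial energy savings.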


(Photo credit: Google)
