Google's data centers consume an enormous amount of power. By recent estimates, the company operates over 2.5 million servers, which consumed 4,402,836 MWh of electricity in 2014, roughly the average yearly consumption of 366,903 U.S. family homes. Over the years, scores of PhDs have focused on coming up with solutions to optimize data center efficiency. Then Google unleashed machine learning on the machines.
Using the same AI technology that taught itself to play Atari games and beat the world champion at Go, Google's DeepMind machine learning algorithms now control 120 different variables in its data centers, constantly learning which combination of adjustments maximizes efficiency. The result? DeepMind achieved a 15% reduction in overall power usage and a 40% reduction in the energy used for cooling, translating into hundreds of millions in cost savings.
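To make the "predict, then optimize" idea concrete, here is a purely illustrative sketch, not DeepMind's actual system: a learned model estimates energy use from controllable setpoints, and a simple search finds the combination of settings that minimizes the prediction. The two variables (fan speed, chilled water temperature) and the model itself are hypothetical stand-ins for the 120 real variables and the neural networks described above.

```python
def predicted_energy(fan_speed, water_temp):
    # Hypothetical learned model: predicted energy (kW) as a function of
    # two setpoints. In the real system a neural network trained on years
    # of sensor history plays this role.
    return (fan_speed - 0.6) ** 2 + (water_temp - 18.0) ** 2 / 25.0 + 40.0

def optimize(steps=500, lr=0.05):
    # Search for the setpoint combination that minimizes the model's
    # prediction, using gradient descent with finite-difference gradients.
    fan, temp = 0.9, 24.0  # initial, inefficient settings
    h = 1e-5
    for _ in range(steps):
        g_fan = (predicted_energy(fan + h, temp) -
                 predicted_energy(fan - h, temp)) / (2 * h)
        g_temp = (predicted_energy(fan, temp + h) -
                  predicted_energy(fan, temp - h)) / (2 * h)
        fan -= lr * g_fan
        temp -= lr * g_temp
    return fan, temp

fan, temp = optimize()
print(f"recommended settings: fan={fan:.2f}, water_temp={temp:.1f}")
```

The point of the sketch is the loop structure, not the toy math: once a model can score any combination of settings, finding better combinations becomes a search problem that a machine can run continuously.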
Commenting on these results, author and MIT professor Erik Brynjolfsson addressed the broader implications: “You can imagine if you take that level of improvement and apply it to all of our systems — our factories, our warehouses, our transportation systems, we could get a lot of improvement in our living standards.”
Apparently, we’ve barely scratched the surface. According to McKinsey, “while 90 percent of all digital data has been created within the last two years, only one percent of it has been analyzed, across both public and private sectors.” And behemoths like GE are fully on board with advanced analytics, spending $1 billion this year alone to analyze data from sensors on gas turbines, jet engines, and oil pipelines. If they can achieve Google-like results, the implications could be staggering.
A Thought Experiment
Most organizations don’t have the resources of Google or GE, but they do experience similar problems that could be solved with a better understanding of all the variables that impact performance and a mindset of constant improvement. It’s important to keep in mind: Google already had some of the most efficient data centers in the industry before it unleashed DeepMind on the problem.
Obviously, you can’t snap your fingers and suddenly become Google. So, perhaps a thought experiment is in order. One where you, for a moment, suspend disbelief, set aside current constraints, and think about what’s possible. With the Google example in mind, in what areas of your organization could you reap the greatest benefit with respect to, for example, production or servicing costs, or the close ratios and customer retention that drive revenue? What are the key variables that impact each of these areas, and if you had perfect information, what would it tell you? If you come up with, for instance, five variables that impact customer support costs, try to come up with 10 or even 20. Challenge your team to do the same. The point is not to engage in some pie-in-the-sky exercise, but to appreciate the level of complexity inherent in any activity within your business, and to start to look for correlations between events, activities, behaviors, and outcomes.
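Looking for those correlations doesn't require a DeepMind. As a minimal sketch, the snippet below ranks a handful of candidate variables by how strongly each correlates with an outcome, here a hypothetical "cost per support ticket." The variable names and the weekly figures are invented for illustration; in practice you would pull them from your own ticketing and staffing systems.

```python
from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly data: candidate variables alongside the outcome.
candidates = {
    "tickets_opened":    [120, 135, 150, 160, 180, 200],
    "first_reply_hours": [4.0, 3.5, 5.0, 6.0, 7.5, 8.0],
    "agents_on_shift":   [10, 10, 11, 11, 12, 12],
}
cost_per_ticket = [21.0, 20.5, 23.0, 25.0, 27.5, 28.0]

# Rank candidate variables by strength of correlation with the outcome.
ranked = sorted(
    ((name, pearson(series, cost_per_ticket))
     for name, series in candidates.items()),
    key=lambda kv: -abs(kv[1]),
)
for name, r in ranked:
    print(f"{name:>18}: r = {r:+.2f}")
```

Correlation is not causation, and with only a few data points these numbers are suggestive at best, but even this crude ranking shows how quickly a list of 10 or 20 candidate variables can be narrowed to the handful worth investigating.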
Further, you need to challenge the conventional wisdom in your organization that reinforces the notion that finding the “single cause” of performance issues will produce optimal outcomes, when in fact understanding the broader collection of variables will likely produce better results. Google identified 120 variables just for data center energy consumption. How about you?