One issue that keeps cropping up as scientific research projects grow ever larger is the energy such facilities require to run. For example, the nuclear physics laboratory at Michigan State University where I previously worked was far and away the largest energy user on campus. And with a new and improved facility on the way, the university is having to expand and upgrade its current power plant to accommodate its needs.
But that pales in comparison to a place like CERN, home of the Large Hadron Collider, which consumes around 1,000 gigawatt-hours per year. Add to that the energy requirements of all of the data centers around the world that store and transmit information from the experiment, and you’ve got yourself a decidedly un-green project on your hands.
For all of the energy that large data centers like these require, however, centralizing computing in them is actually the greener way to go. Let’s flip this on its head for a second and take into account the enormous amount of energy used by personal computers, local data storage devices, and other business electronics in offices across the entire country. Those clearly outweigh centralized banks of super data centers, right?
According to a recent report from Northwestern University, yes indeedy.
A six-month study conducted in conjunction with Lawrence Berkeley National Laboratory has found that moving common software applications used by 86 million U.S. workers to the cloud – software such as spreadsheets, file sharing, word processing, email, customer relationship management software, and the like – would save enough electricity to power the entire city of Los Angeles for a year. The energy cut would be about 87 percent, or about 23 billion kilowatt-hours.
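As a back-of-the-envelope sanity check on those two figures, if 23 billion kilowatt-hours represents an 87 percent cut, we can work backward to the implied current energy use of these applications. This is simple arithmetic on the reported numbers, not anything from the report's own model:

```python
# Back-of-the-envelope check on the report's headline figures.
# The savings and percentage come from the report; the baseline is derived.

savings_kwh = 23e9       # reported savings: ~23 billion kWh per year
savings_fraction = 0.87  # reported cut: ~87 percent

# Implied current (pre-cloud) energy use of these applications:
baseline_kwh = savings_kwh / savings_fraction

# Energy these applications would still use after the move:
remaining_kwh = baseline_kwh - savings_kwh

print(f"Implied baseline:     {baseline_kwh / 1e9:.1f} billion kWh/year")
print(f"Remaining after move: {remaining_kwh / 1e9:.1f} billion kWh/year")
```

In other words, the same work that currently takes roughly 26 billion kilowatt-hours a year would take only about 3 billion in the cloud.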
Of course, that says nothing about the security issues and confidentiality requirements that said businesses would somehow need to address, but this is likely more of an academic thought experiment than an actual this-is-going-to-happen study.
What’s also neat is that everything that went into this report – all of the data, assumptions, models, computer code, etc. – is now available in open-access form. The model allows anyone to tinker with its assumptions, numbers, or calculations and bend it to their own use. For example, the manager of a data center could compare current energy requirements with those the center would have if its workloads were moved to the cloud. Such tools are essential to convincing administrators that such a move makes too much economic sense not to implement.
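To illustrate the kind of comparison the open model makes possible, here is a minimal sketch. The function name, user counts, and overhead factors below are purely illustrative assumptions of mine, not figures or code from the actual report:

```python
# Illustrative sketch of an on-premise vs. cloud energy comparison.
# All names and values here are hypothetical, not from the report's model.

def annual_energy_kwh(users: int, kwh_per_user: float, overhead_factor: float) -> float:
    """Total annual energy: per-user consumption scaled by an
    infrastructure overhead factor (cooling, idle servers, etc.)."""
    return users * kwh_per_user * overhead_factor

# Assumed inputs for a small office (illustrative only):
on_premise = annual_energy_kwh(users=500, kwh_per_user=150, overhead_factor=1.8)
cloud = annual_energy_kwh(users=500, kwh_per_user=150, overhead_factor=0.25)

pct_saved = 100 * (on_premise - cloud) / on_premise
print(f"On-premise: {on_premise:,.0f} kWh, cloud: {cloud:,.0f} kWh "
      f"({pct_saved:.0f}% saved)")
```

Swapping in a real facility's user counts and measured overheads is exactly the sort of tinkering the open-access model is meant to enable.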
And if enough CEOs can be convinced that it will help their bottom line, maybe moving to the cloud will eventually help reduce our overall energy requirements in the future.
The report, “The Energy Efficiency Potential of Cloud-Based Software: A U.S. Case Study,” was funded by Google and written by lead author Eric Masanet of Northwestern University, Northwestern graduate students Jiaqi Liang, Xiahui Ma, and Ben Walker, and Lavanya Ramakrishnan, Valerie Hendrix, Arman Shehabi, and Pradeep Mantha of Berkeley Lab.