These servers may hold the key to smarter grids. Source: Flickr/Tophost
Utilities are raking in stimulus funds and smart meter manufacturers are working feverishly to imbue the grid with the smarts to redistribute the electrical load down to the household level when consumption rates spike. Envision automatically time-shifting that EV charging or dishwashing cycle to cheaper overnight hours and you’ve got the idea. It may sound complicated, but efforts to minimize data centers’ carbon footprints offer many lessons — and incentives — for just this sort of smart grid innovation.
Ever-increasing demands made on IT, cloud computing’s staggering growth and a booming web application space are requiring data center operators to take a more agile approach to how and where data is processed. Take, for instance, Google’s new data center in Belgium. It features a free cooling design completely devoid of chillers. What makes it truly remarkable is that the facility’s computational workload is redistributed to other sites for the few days of the year that the weather proves too warm for the servers and networking infrastructure.
Suddenly, local weather forecasting becomes a network management tool. Knowing days in advance that the temperature will rise, administrators have ample time to prepare for an orderly shutdown of equipment and the redistribution of server workloads. You can expect this to become an integral part of the IT team’s bag of tricks as corporations design more facilities that benefit from a region’s climate.
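The forecast-driven migration described above can be sketched in a few lines. Everything here is an illustrative assumption — the 27 °C free-cooling ceiling, the site names, and the data structures are made up for the example, not drawn from Google's actual tooling:

```python
FREE_COOLING_LIMIT_C = 27  # assumed safe intake ceiling for a chiller-less design


def plan_week(local_forecast, remote_forecasts):
    """Plan which days to run locally and which to migrate.

    local_forecast: {day: predicted_high_c} for the chiller-less site
    remote_forecasts: {site_name: {day: predicted_high_c}}
    Returns {day: "local" or the coolest remote site with headroom}.
    """
    plan = {}
    for day, high in local_forecast.items():
        if high <= FREE_COOLING_LIMIT_C:
            plan[day] = "local"  # free cooling suffices; no migration
        else:
            # Pick the coolest remote site still under the cooling limit.
            candidates = {site: fc[day] for site, fc in remote_forecasts.items()
                          if fc[day] <= FREE_COOLING_LIMIT_C}
            plan[day] = min(candidates, key=candidates.get) if candidates else "local-throttled"
    return plan
```

Because forecasts arrive days ahead, a plan like this gives administrators the lead time the article describes for an orderly shutdown and workload handoff.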
Applied to the demand-response side of the smart grid, similar technology can have an enormous impact. If a heat wave approaches, utilities can instruct smart meters to dial back or reschedule lower priority jobs to keep the load on the grid manageable and air conditioners humming.
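A minimal sketch of that demand-response logic, assuming a utility-set wattage cap and per-appliance priorities — the appliance names, wattages, and priority values are invented for illustration:

```python
def respond_to_cap(loads, cap_watts):
    """Keep higher-priority loads running under a utility-imposed cap.

    loads: list of (name, watts, priority); higher priority = keep first.
    Returns (kept, deferred): kept loads fit under cap_watts, the rest
    are deferred, e.g. time-shifted to cheaper overnight hours.
    """
    kept, deferred, total = [], [], 0
    for name, watts, priority in sorted(loads, key=lambda load: -load[2]):
        if total + watts <= cap_watts:
            kept.append(name)
            total += watts
        else:
            deferred.append(name)  # reschedule rather than brown out
    return kept, deferred
```

Under a heat-wave cap, a scheme like this keeps the air conditioner humming while the EV charger quietly waits for off-peak hours.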
Similarly, research on rerouting Internet traffic to lower energy costs and early efforts at prodding virtualized workloads to “follow the moon” could offer models to help utilities working to shift peak energy loads and tap distributed resources.
Researchers at MIT and Carnegie Mellon were primarily concerned with cost savings when they developed a system that uses real-time energy price monitoring to transfer computing loads to locations with lower prices. Katie at Earth2Tech points out that at this early stage the approach won’t necessarily cut consumption, nor can it guarantee that power is derived from clean sources. But, as she notes, the technology has potential as a means of helping utilities better manage energy production and distribution so that fewer polluting power plants are built.
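The price-chasing idea can be sketched as a greedy placement routine in the spirit of the MIT/CMU work — the site names, prices, and capacities below are illustrative assumptions, not data from the actual system:

```python
def cheapest_placement(jobs_kw, site_prices, site_capacity_kw):
    """Greedily route each job to the lowest-priced site with capacity.

    jobs_kw: {job_name: power_draw_kw}
    site_prices: {site_name: current_price_per_kwh}
    site_capacity_kw: {site_name: spare_capacity_kw}
    Returns {job_name: site_name}.
    """
    remaining = dict(site_capacity_kw)
    placement = {}
    for job, kw in jobs_kw.items():
        # Only consider sites with enough spare capacity for this job.
        open_sites = [site for site in site_prices if remaining[site] >= kw]
        site = min(open_sites, key=site_prices.get)  # cheapest power wins
        placement[job] = site
        remaining[site] -= kw
    return placement
```

A production system would fold in latency, reliability, and carbon intensity alongside price — which is exactly the direction the next paragraph points.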
Under this model, utilities will also find that they are competing for the computational workloads of data centers, some of their biggest customers. If renewable energy sources and reliability data become factors in this load balancing scheme — and it’s only a matter of time until they do — utilities will have little choice but to better manage electrical loads and clean up their act. Otherwise, they risk losing the business of cloud providers and carbon-cutting corporations to rivals that can deliver.