Moore’s Law, meet Jevons Paradox. That’s been the story of energy efficiency in the IT sector, where constant gains in computing energy efficiency are subsumed by the ever-expanding hunger for the cheaper computing power those advances provide. While each new product is greener by virtue of better performance per watt, it’s hard to judge what weight that efficiency carries among customers. Sheer power capacity constraints — not enough substations per data center — are forcing data centers to rein in energy consumption. But reliability and performance are still paramount for IT professionals, as the slow uptake of corporate cloud computing illustrates. At the same time, the cloud’s efficiency promises are coming under question — another green selling point waiting for real-world energy savings data to prove itself.
But with so much discussion around Green IT these days, 2011 could give some long-running assertions a chance to prove themselves. Here are some questions I’ll be asking this year as I track the green IT sector’s progress — what others would you add?
Will companies trade Intel’s dominant x86 server architecture for low-power processing alternatives? 2011 is the year for lower-power challengers to Intel’s (and AMD’s) dominance of the server market. Atom-based servers from SeaMicro and the ARM-based systems from Marvell and startup Calxeda offer low power use, but in exchange for slower performance: Marvell’s 1GHz at less than 1 watt, compared to the 3.6GHz at up to 130 watts of Intel’s latest chips, to take one example. And earlier this month Nvidia announced its upcoming ARM-based server CPU for integration with its GPUs, another low-power alternative, although the move appeared mainly about getting out of Intel’s shadow. How quickly these offerings find their way into servers and data centers may help indicate the market’s valuation of energy efficiency.
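The trade-off above can be put in rough numbers. A minimal sketch, using only the clock-speed and wattage figures cited in this paragraph; clock speed is a crude proxy for real throughput (cores, architecture and workload matter far more), so treat this strictly as illustration, not a benchmark:

```python
# Back-of-envelope performance-per-watt comparison from the figures above.
# Clock speed is only a rough stand-in for actual server performance.

def ghz_per_watt(clock_ghz: float, watts: float) -> float:
    """Crude efficiency metric: clock cycles delivered per watt."""
    return clock_ghz / watts

marvell_arm = ghz_per_watt(1.0, 1.0)    # ~1 GHz at under 1 watt
intel_x86 = ghz_per_watt(3.6, 130.0)    # 3.6 GHz at up to 130 watts

print(f"ARM: {marvell_arm:.2f} GHz/W")
print(f"x86: {intel_x86:.3f} GHz/W")
print(f"ARM advantage: {marvell_arm / intel_x86:.0f}x on this crude metric")
```

Even allowing that a 3.6GHz x86 core does far more useful work per cycle than a 1GHz ARM core, the raw gap (roughly 36x on this naive metric) shows why data center buyers are at least looking at these chips.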
Will cloud computing start calculating specifics of its energy efficiency benefits? Cloud computing is a far more efficient alternative to disparate, poorly managed and out-of-date computing environments, and that should apply to energy use as well. Microsoft has claimed that switching to the cloud can shave 30 to 90 percent off the carbon footprint of enterprise computing environments. But when the power used to transport data to and from the cloud grows large enough, cloud computing can be a net energy loser. I’d be interested to see the math behind cloud computing efficiency across various use cases — data-intensive video streaming or storage retrieval versus simple web hosting and office support-type situations. That would give the industry more visibility into just how green their cloud computing plans really are.
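The "net energy loser" point reduces to simple accounting: cloud compute savings minus the energy cost of moving data over the network. A minimal sketch with entirely hypothetical figures (the per-GB network energy and workload numbers below are invented for illustration, not sourced):

```python
# Whether the cloud saves energy depends on how much data crosses the network.
# All figures here are hypothetical, chosen only to illustrate the break-even.

def total_energy_kwh(compute_kwh: float, gb_transferred: float,
                     network_kwh_per_gb: float) -> float:
    """Compute energy plus the energy cost of moving data to/from the cloud."""
    return compute_kwh + gb_transferred * network_kwh_per_gb

# Hypothetical monthly figures for one workload:
on_premise = total_energy_kwh(100.0, 0.0, 0.0)         # local, no transfer
light_cloud = total_energy_kwh(60.0, 50.0, 0.05)       # office-app style use
heavy_cloud = total_energy_kwh(60.0, 2000.0, 0.05)     # video-streaming style

print(f"on-premise:  {on_premise:.1f} kWh")
print(f"light cloud: {light_cloud:.1f} kWh")
print(f"heavy cloud: {heavy_cloud:.1f} kWh")
```

With these made-up inputs, the office-style workload comes out ahead in the cloud while the data-heavy one comes out behind, which is exactly why per-use-case math, not a blanket efficiency claim, is what the industry needs to publish.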
Will office IT energy efficiency get the respect it deserves? The corporate computing environment is awash with energy-wasteful arrangements and habits, from underutilized servers in data centers to always-on desktop computers. Data centers are driven to energy efficiency primarily by growth constraints — investing in smarter, greener IT is a lot cheaper than building a new data center, after all. But office computing environments don’t appear to be investing as much to save energy in their networked IT environments, even when new technology can pay for itself in less than a year, as startup Veridem claims, or cut power bills by 20 percent, as Fujitsu promises. More advanced technology, such as French startup AVOB’s promise to ramp down processor speed and voltage for daytime power shaving, might see even slower adoption among clients worried about sacrificing performance for energy savings.
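Claims like "pays for itself in under a year" rest on straightforward payback arithmetic. A minimal sketch: the 20 percent savings figure is the one Fujitsu cites above, but the upfront cost and monthly power bill are hypothetical numbers chosen for illustration:

```python
# Payback-period arithmetic behind "pays for itself in under a year" claims.
# Only the 20% savings fraction comes from the article; the cost and bill
# figures are hypothetical.

def payback_months(upfront_cost: float, monthly_bill: float,
                   savings_fraction: float) -> float:
    """Months until cumulative energy savings cover the upfront cost."""
    monthly_savings = monthly_bill * savings_fraction
    return upfront_cost / monthly_savings

# A hypothetical office: $5,000 for PC power-management software,
# a $2,500/month electricity bill, and a 20% cut in that bill.
months = payback_months(5000.0, 2500.0, 0.20)
print(f"payback in {months:.0f} months")
```

At these example numbers the investment clears in 10 months, which is the kind of sub-year payback that makes the slow office uptake described above so puzzling.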