You've Got the Power
Earlier this month, McKinsey & Co. released a study that undoubtedly surprised both the airline and technology industries. By McKinsey's estimates, data centers will surpass airlines in greenhouse-gas emissions by 2020.
The study also points out two poorly kept IT secrets: Most servers run at just 6 percent of capacity on average, and energy consumption isn’t typically factored into the total cost of ownership of server resources.
Server virtualization is obviously one powerful tool available to IT managers for addressing the utilization issue. Although most IT managers despise server sprawl, until server virtualization on x86 machines became mainstream there was no easy way to get more mileage out of existing processing resources while still providing the quick, cheap uptime that most business users demand.
So, what's keeping more IT managers from tackling the power issue? Lack of time? Lack of management interest? Lack of IT interest? These three factors are certainly related, but the biggest hindrance is probably a lack of data. And unfortunately, that's not a particularly easy problem to fix.
Power consumption is an issue too few examine. Although virtualization vendors tout reduced power consumption as a major benefit, few IT managers view cutting electrical costs as a top consideration. A recent BizTech poll found that only 6 percent consider reducing power consumption a reason to implement server virtualization, and only 9 percent of IT managers say they measure the power consumption of their company's computers. Most businesses can't even separate IT's electricity costs from those of other departments.
Still, power costs are a large part of overall IT expenditures — and growing rapidly. A recent IDC study estimated that the electrical cost of powering and cooling data centers worldwide would reach $44.5 billion by 2010, an amount representing roughly 70 percent of new server spending. And as more businesses scrap their paper-based systems in favor of digital-information management, the Environmental Protection Agency expects power consumption by the nation's data centers to rise, particularly among those in the federal government. As the single largest consumer of data center power in the United States, the federal government has been given a mandate from the executive branch to cut data center power consumption 30 percent by 2015.
There’s no doubt that IT’s power consumption contributes significantly to the operating costs of businesses in the private sector as well. We’ll soon see IT managers taking a closer look at power usage, because server power consumption is costly and in some cases may be unnecessary.
Editor in Chief
The U.S. Environmental Protection Agency estimates that data centers housing computing, network and data storage equipment consumed about 60 billion kilowatt-hours of electricity in 2006 — about 1.5 percent of total U.S. electricity consumption.
Do you currently monitor the electrical consumption of your computer systems?
3% Don't Know
Source: CDW poll of 377 BizTech readers