
How to Get IT Energy Costs Under Control

Aug. 7, 2008
Virtualization offers an energy-efficient way to rein in data center power consumption.

"The data center in today's facilities has become a hotbed of power consumption," says Dave Hart, chief technology officer with Presidio Networked Solutions, a provider of advanced IT infrastructure solutions. Six years ago, a single network server consumed about 100 watts of power, whereas today's servers consume approximately 400 watts, he points out. What's more, many software applications require their own dedicated servers to ensure performance and reliability, further adding to the energy consumption. And on top of all that, Hart adds, "while consuming four times the power of earlier models, servers also were moving to a smaller form factor, meaning that more of them could be placed together on a rack -- concentrating heat output and allowing power and cooling costs to run amuck."

Fortunately, the emerging trend toward virtualization has "changed the equation," Hart says. "The concept behind virtualization is that a single server can be made to act like multiple servers by hosting many applications, databases or systems. Today, facilities are routinely reducing their number of servers by a factor of 10-to-1. When dedicated to a single application, most servers today use just 5% to 12% of their capacity. By allowing that server to host up to 10 or more 'virtual machines' -- each running a single application -- and putting the additional capacity to good use, facilities can reduce power consumption and heat output, and therefore data center costs, by up to 90%." In other words, Hart notes, one server can now do the work of 10, thanks to virtualization.
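To make that arithmetic concrete, here is a minimal back-of-envelope sketch in Python using the figures quoted above (400 watts per server, 10-to-1 consolidation). The $0.10-per-kilowatt-hour electricity rate and the one-to-one cooling overhead are illustrative assumptions, not numbers from the article:

    # Back-of-envelope estimate using the figures quoted above: 400 W per
    # server and 10-to-1 consolidation. The electricity rate and cooling
    # overhead are assumptions for illustration, not article figures.

    SERVER_WATTS = 400        # draw of a modern server, per Hart
    CONSOLIDATION_RATIO = 10  # virtual machines hosted per physical server
    COOLING_OVERHEAD = 1.0    # assumed: one watt of cooling per watt of IT load
    PRICE_PER_KWH = 0.10      # assumed average commercial electricity rate

    def annual_cost_usd(server_count: int) -> float:
        """Yearly energy cost for a fleet of servers, including cooling."""
        watts = server_count * SERVER_WATTS * (1 + COOLING_OVERHEAD)
        kwh_per_year = watts * 24 * 365 / 1000
        return kwh_per_year * PRICE_PER_KWH

    before = annual_cost_usd(100)                        # 100 dedicated servers
    after = annual_cost_usd(100 // CONSOLIDATION_RATIO)  # 10 virtualized hosts
    print(f"Before: ${before:,.0f}/yr  After: ${after:,.0f}/yr  "
          f"Savings: {100 * (1 - after / before):.0f}%")
    # Before: $70,080/yr  After: $7,008/yr  Savings: 90%

Under these assumptions, cutting the fleet 10-to-1 cuts the energy bill by the same 90% Hart cites, since power draw scales with the number of running machines.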

[Chart: Does your organization have a long-term plan or strategy targeted at reducing the environmental footprint of its IT infrastructure? Source: Cutter Consortium]

Hart points to a report from the Environmental Protection Agency (EPA) indicating that energy consumption by servers and data centers has doubled in the past five years and is expected to nearly double again in the next five, to more than 100 billion kilowatt-hours, at a cost of about $7.4 billion annually. "Through 2009, energy costs will emerge as the second highest operating cost in 70% of worldwide data center facilities," adds Michael Bell, research vice president at Gartner. At that point, he says, energy costs will be exceeded only by labor among major data center operating costs.
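As a quick sanity check, the quoted EPA figures imply an average electricity rate of roughly 7.4 cents per kilowatt-hour. A one-line sketch, assuming the $7.4 billion applies to the full 100 billion kWh:

    # EPA projection quoted above: >100 billion kWh at ~$7.4 billion/yr.
    projected_kwh = 100e9
    projected_cost_usd = 7.4e9
    print(f"Implied average rate: ${projected_cost_usd / projected_kwh:.3f}/kWh")
    # Implied average rate: $0.074/kWh

That figure is in line with typical U.S. commercial electricity rates of the period, which suggests the EPA's cost projection is a straightforward consumption-times-rate estimate.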

The bottom line, Hart says, is that "virtualization reduces space requirements, power consumption and cooling costs, while returning another benefit -- the ability to forestall or even forgo data center expansion as IT requirements increase." A side benefit, he adds, is that by reducing carbon dioxide emissions, a company can comply with local or industry ordinances. For instance, "global energy efficiency programs," such as those from utility companies PG&E and SoCal Edison, now provide incentives of $150 to $300 per server removed.
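Combining those rebates with the energy arithmetic sketched earlier gives a rough first-year payoff per decommissioned server. Again, the electricity rate and cooling overhead are illustrative assumptions rather than article figures:

    # First-year saving per server removed: utility rebate plus avoided
    # energy spend. Rate and cooling overhead are assumed, as before.
    SERVER_WATTS = 400
    COOLING_OVERHEAD = 1.0
    PRICE_PER_KWH = 0.10

    yearly_energy_usd = (SERVER_WATTS * (1 + COOLING_OVERHEAD)
                         * 24 * 365 / 1000 * PRICE_PER_KWH)

    for rebate in (150, 300):  # incentive range cited for PG&E / SoCal Edison
        print(f"Rebate ${rebate}: first-year saving ${rebate + yearly_energy_usd:,.0f}")
    # Rebate $150: first-year saving $851
    # Rebate $300: first-year saving $1,001

In other words, under these assumptions the rebate is a one-time bonus on top of roughly $700 per year in recurring energy savings for every server taken out of service.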
