Environmental issues may not be on the immediate agenda for most IT managers and CIOs but the amount of power IT consumes will certainly be under the microscope in years to come, according to analyst firm Gartner.
Steve Prentice, research VP at Gartner, said IT is uniquely positioned to confront increasing energy costs as the technology is one of the largest consumers of power within organizations.
"You cannot keep it a secret," Prentice said. "Performance-per-watt will become critical over the next decade."
During his keynote address at this year's Gartner Data Centre Summit in Sydney, Prentice said that as hardware prices continue to fall and power costs continue to rise, heat generation is becoming a problem for organizations stacking up cheap commodity hardware, "and it's getting worse".
"The requirement for modern hardware is going to be huge [and] for every watt that goes into a processor you will use another watt to cool it," he said, adding vendors' promotion of low-power consumption products is indicative of this trend.
Prentice cited the Barcelona Supercomputer Centre's 4800-processor cluster as an example of the scale of power consumption by computers. The supercomputer reportedly accounts for 75 percent of the host university's electricity consumption.
Seeking low-power hardware is not the only way to cut energy consumption: virtualization technology is emerging as a means of squeezing better utilization out of existing servers.
With typical utilization rates of 10 percent, Prentice was blunt in describing the amount of capacity being wasted.
"Of the computing power you buy, you are throwing away 90 percent," he said, adding that with increased virtualization, utilization levels will climb to 30 percent.
"Virtualization is nothing new [but] it's a critical tech that opens a path to automation."
Virtualization will also make it possible to reconfigure virtual servers to suit individual applications, delivering significant performance improvements.
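The figures Prentice cites lend themselves to a quick back-of-envelope calculation. The sketch below is purely illustrative (the function names and the 1 kW example load are assumptions, not Gartner's model): it shows how 10 percent utilization translates into 90 percent wasted capacity, and how a one-watt-of-cooling-per-watt-of-compute ratio doubles total power draw.

```python
# Illustrative arithmetic only; numbers are taken from the keynote
# figures quoted in the article, the rest is assumed for the example.

def wasted_capacity(utilization: float) -> float:
    """Fraction of purchased compute capacity left idle."""
    return 1.0 - utilization

def total_power(it_watts: float, cooling_watts_per_it_watt: float = 1.0) -> float:
    """Total draw when each watt of IT load needs a matching watt of cooling."""
    return it_watts * (1.0 + cooling_watts_per_it_watt)

# Typical utilization today vs. the level Prentice expects with virtualization.
print(wasted_capacity(0.10))   # 90 percent of capacity "thrown away"
print(wasted_capacity(0.30))   # still 70 percent idle after virtualization
print(total_power(1000.0))     # a 1 kW server load draws 2 kW overall
```

Even at the improved 30 percent utilization figure, the sketch makes Prentice's point: most of the capacity an organization pays for, and pays to cool, still sits idle.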