Soaring wholesale prices have pushed business electricity bills up by as much as 19.9 per cent as of July, making it more important than ever for businesses to assess energy usage and operational effectiveness. Our appetite for enormous amounts of data and content streamed in real time is driving demand for IT processing to new levels. But it's the accelerating energy required to cool IT equipment that is costing businesses thousands.
According to a report by McKinsey, the IT industry as a whole consumes two per cent of the world's energy and will generate as much CO2 as the airline industry by 2020. Data centres account for a large share of this consumption; research as far back as 2009 put the figure at close to 1.5 per cent of Australia's total energy consumption. That figure has more than likely risen significantly, given the substantial number of new data centres built here since then.
Data centre containment and cooling is essential; this much is known. But not all containment is equal, and most businesses don't realise that the right combination of containment and cooling configuration can save them many thousands of dollars every year. Many businesses install containment and assume their energy usage will automatically be cut, while others tinker with the cooling configuration and disregard the impact of air mixing.
Even though containment was first introduced into IT facilities 11 years ago, up to 85 per cent of Stulz customers surveyed still aren't using containment to better distribute airflow, meaning their critical infrastructure may be ineffective, inefficient or both, and could be costing them money. Given the significant innovations in fan and compressor technology over this period, why is there not a greater emphasis placed on these two energy-sapping areas? The truth is, it's not just about investing in the containment infrastructure; it's also about the method used to manage the environmental conditions within the data centre.
The cooling challenge
To work effectively, it is industry best practice for a data centre to have at least a cold aisle/hot aisle configuration, a containment system and blanking panels in the racks. This structure ensures all the cool air passes through the front of the servers at a consistent airflow rate and temperature. Without this arrangement, an effective airflow management strategy is jeopardised, which can result in:
• Risk of IT system failure
Cool air does not reach all parts of the rack, and hot spots develop.
• Poor efficiency
It is a misconception that users need to over-cool the supply air to maintain optimum temperatures. Many data centres are uncomfortably cold in the cold aisle. Our audit service has found that 60 per cent of facilities are moving more than twice the airflow needed by the IT equipment.
• Escalated costs
When air is allowed to recirculate within the rack or between the aisles, it is simply wasted energy.
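The scale of the over-supply problem can be sketched with the standard sensible-heat equation for air. The figures below (a 100 kW IT load, a 10 K server temperature rise, and the "more than twice the airflow" audit finding) are illustrative assumptions, not values from a specific facility.

```python
# Sketch: how much airflow an IT load actually needs, using standard
# air properties. All inputs are illustrative assumptions.
RHO = 1.2    # air density, kg/m^3 (sea level, ~20 degrees C)
CP = 1005.0  # specific heat of air, J/(kg*K)

def required_airflow_m3s(it_load_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove it_load_kw of heat
    at a server inlet-to-outlet temperature rise of delta_t_k."""
    return it_load_kw * 1000.0 / (RHO * CP * delta_t_k)

needed = required_airflow_m3s(100.0, 10.0)  # hypothetical 100 kW hall
supplied = 2.2 * needed                     # audit finding: >2x over-supply
print(f"needed {needed:.1f} m^3/s, supplied {supplied:.1f} m^3/s")
```

Every cubic metre per second above the "needed" figure is fan energy spent moving air the IT equipment never uses.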
Why users need a holistic solution
A holistic, engineered approach to air containment is the best strategy for eliminating air mixing in the room, creating a more constant and consistent air temperature at the IT equipment, and delivering greater energy efficiency. The managed approach has five steps:
Step 1: Real time thermal audit
Before operators make any investment decisions, they must understand how their data centre is really performing. The process should always start with a comprehensive real-time thermal audit of the data centre using multiple sensors. This audit captures granular conditions such as floor pressure, temperatures at the racks and cooling equipment performance over a period of time, revealing how effectively the cooling equipment's output is being utilised.
Step 2: Analyse the results
The second step involves recording the IT loads, estimating the cooling equipment capacities, and then calculating the difference. This tells users whether they are delivering cool air efficiently – the 'efficiency gap'.
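The arithmetic behind the efficiency gap can be sketched in a few lines. The load and capacity figures below are hypothetical; the article does not prescribe a specific formula, so this expresses the gap simply as excess running cooling relative to the IT load.

```python
# Sketch of the 'efficiency gap' comparison: measured IT load versus the
# cooling capacity actually running. All numbers are hypothetical.
def efficiency_gap(it_load_kw: float, cooling_running_kw: float) -> float:
    """Excess cooling as a fraction of IT load; 0.0 means perfectly matched."""
    return (cooling_running_kw - it_load_kw) / it_load_kw

gap = efficiency_gap(it_load_kw=400.0, cooling_running_kw=900.0)
print(f"efficiency gap: {gap:.0%} more cooling running than the IT load needs")
```

A large positive gap is the signal to move on to step 3 and tune the cooling plant down towards the real load.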
Step 3: Optimize the environment
With clever real-time monitoring systems, users can comfortably tweak fan speeds, adjust the environmental settings of the computer room air conditioning equipment and immediately see the effect of each change. Users should keep tuning until they reach the optimum, or have found a balance between SLAs and efficiency. During this process it is important to ensure that the raised floor (if one exists) is correctly balanced, the racks are sealed with blanking panels in any empty spaces, and any other non-productive openings are also sealed.
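Why does tweaking fan speeds pay off so strongly? By the standard fan affinity laws, fan power scales roughly with the cube of speed, so even modest airflow reductions during tuning yield outsized energy savings. The rated power and speed figures below are illustrative assumptions.

```python
# Sketch: fan affinity law (power ~ speed^3). Trimming oversupplied airflow
# during tuning yields disproportionate savings. Values are hypothetical.
def fan_power_kw(rated_kw: float, speed_fraction: float) -> float:
    """Approximate fan power at a given fraction of rated speed."""
    return rated_kw * speed_fraction ** 3

before = fan_power_kw(10.0, 1.0)   # fan at full speed
after = fan_power_kw(10.0, 0.7)    # airflow trimmed to 70% of full speed
saving = 1 - after / before
print(f"{saving:.0%} fan energy saved at 70% speed")
```

Reducing airflow by 30 per cent cuts fan energy by roughly two-thirds under this law, which is consistent in magnitude with the 56–60 per cent fan energy savings reported later in this article.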
Step 4: Measure the results
Once this is completed, the data centre's new power consumption can be measured and the predicted savings become visible. And with more available room cooling capacity and lower running costs, businesses can deploy additional IT projects or make other investments.
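One common way to express the measured result is Power Usage Effectiveness (PUE), the standard ratio of total facility power to IT power, which also appears in the results later in this article. The before/after figures below are hypothetical.

```python
# Sketch: PUE = total facility power / IT power. A value of 1.0 would mean
# every watt goes to IT. Inputs are hypothetical before/after readings.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness for a facility."""
    return total_facility_kw / it_load_kw

before = pue(total_facility_kw=800.0, it_load_kw=400.0)  # pre-optimisation
after = pue(total_facility_kw=640.0, it_load_kw=400.0)   # post-optimisation
print(f"PUE improved from {before:.2f} to {after:.2f}")
```

Because the IT load is unchanged, the entire PUE improvement here comes from the cooling and airflow side, which is exactly what the managed approach targets.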
Step 5: Sustain and maintain the environment
But the process doesn't stop there; IT rooms change continuously – new racks go in, servers are added, removed or replaced, floor grilles are adjusted, and computer room cooling equipment controls are changed. Any of these can erode, or even undo, the hard work done during the optimisation phase. It is therefore recommended that an optimisation plan be incorporated into any provider's maintenance program.
Containing energy and costs
A managed approach to airflow management will effectively reduce the power consumption of a data centre and deliver significant cost benefits. It's that simple. But the impact doesn't stop there: by investing in a holistic containment solution, users also contribute to their employer's CSR policy, without compromising system resilience, while extending equipment life.
| Business type | Results |
| --- | --- |
| Financial institution | 56% fan energy savings; 5% total facility energy savings; extended life of existing equipment; a significant number of floor grilles removed |
| Large telecom company | 60% fan energy savings; improved data hall capacity as a direct result of liberating stranded capacity |
| Multinational colocation facility | Fan energy reduced from 230 kW pre-project to 63 kW in one data hall (a reduction of more than 70 per cent); substantially reduced Power Usage Effectiveness (PUE); over $700K annual savings across four data halls |
John Jakovcevic is managing director of Stulz Australia.