When it comes to business infrastructure, little is more critical than the enterprise data centre.
This article is part one of a two-part series.
Yet with server virtualisation well past the novelty stage, companies looking towards virtualisation-based cloud computing, and the government's planned carbon tax threatening major financial impact, organisations of all sizes are now moving to revisit, consolidate and improve their data centre infrastructure.
WorkCover Queensland, the state’s peak employer insurance body, is among them. Currently operating out of an in-house managed primary data centre and CITEC-managed disaster recovery (DR) facility, WorkCover Queensland is ramping up a massive four-year overhaul of its data centre infrastructure that will push management of that infrastructure to hosting providers.
“WorkCover does not see the operation of data centre services as a core business activity,” explains Trevor Barrenger, the body’s general manager of business solutions.
“We believe the best support in the future will be by moving away from owning back-office ICT infrastructure and moving towards the provisioning of a hosted model. Some equipment will be physically moved between data centres and some end-of-life equipment will be replaced. In time, this will remove the requirement to own computer servers and storage equipment, and to replace ageing equipment on a regular basis.”
The body’s new site will be built around a dual data centre design, with redundancy of all equipment including 10Gbps backbone links and a variety of new HP-UX, Linux and Windows Server-based systems.
As well as improving resilience in the wake of the devastating Queensland floods, Barrenger anticipates the strategy will provide great benefits by helping the organisation leverage the service provider’s investment in power-efficient infrastructure, and refocus WorkCover technical staff on core technology services.
Green clouds are the new black
All across Australia, all kinds of major bodies — including big name organisations such as Victoria University, the CSIRO, ABC, Victorian Department of Treasury and Finance, and ASX, which is building a $32m data centre — are pulling up stumps and moving their long-established data centre infrastructures.
They’re headed to new facilities, or at the very least consolidating their servers into a smaller number of high-density racks to reduce wastage of space and power, and reduce the cost of cooling ever-denser racks.
It’s the natural evolution of a trend that started nearly a decade ago, with the advent of server virtualisation and the realisation that the one-box, one-application rule simply wasn’t going to cut it in the high-volume world of online business.
With much of the low-hanging fruit already picked and virtualisation well established in Australia’s mid-sized and large organisations, it’s hard to find a business in Australia that’s not revisiting its power consumption with an eye to economy. And in nearly every case, power-guzzling 24x7 data centres — once a luxury, then a necessity, and fast becoming a liability — are coming into the firing line.
“With the carbon tax announced only recently, we’re already starting to see people take interest, particularly in how they’re going to measure and monitor the cost of data centres,” says David Hanrahan, general manager of virtual data centres with systems integrator Dimension Data.
“They’re still going to design their data centres for these workloads and coming capacity, but if they can put a Green story around it that’s even better.”
Green appeal is making the hard sell a lot easier for data centre infrastructure providers like Emerson Network Power, which recently worked with an unnamed bank to consolidate its data centre equipment after an executive mandate to implement carbon reduction strategies.
The resulting project saw a massive consolidation of servers into a smaller number of blade servers, with 60 racks’ worth of equipment squeezed into just 18 data centre racks, each drawing up to 12kW of power. Cooling systems were revisited to implement ‘smart aisle’ and ‘cold aisle containment’ strategies, which focus cooling air on the high-density equipment where it’s most needed.
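As a rough illustration of the consolidation arithmetic, the figures reported here (60 racks down to 18, each new rack drawing up to 12kW) imply a sharp jump in per-rack density even if total IT load stays constant. The sketch below assumes, as the project's engineers describe later in the piece, that the overall power and heat output is unchanged; the implied pre-consolidation density is derived from that assumption, not stated in the article.

```python
# Back-of-envelope sketch of the bank's data centre consolidation.
# Figures from the article: 60 racks reduced to 18, each new rack
# drawing up to 12 kW. Assumption: total IT load is unchanged.

racks_before = 60
racks_after = 18
kw_per_rack_after = 12                      # peak draw per consolidated rack

total_kw = racks_after * kw_per_rack_after  # same load before and after
kw_per_rack_before = total_kw / racks_before
rack_reduction = 1 - racks_after / racks_before

print(f"Total IT load: {total_kw} kW")                               # 216 kW
print(f"Implied pre-consolidation density: {kw_per_rack_before} kW/rack")  # 3.6
print(f"Rack count reduced by {rack_reduction:.0%}")                 # 70%
```

The roughly threefold jump in kW per rack is what makes containment strategies like cold-aisle containment pay off: cooling can be concentrated on a much smaller footprint.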
A review of every aspect of the facility even saw the data centre’s walls repainted and the colour of the equipment considered: with light-coloured equipment instead of dark, more light is reflected around the data centre, requiring less energy for lighting.
“The levels of saving were significant enough for them to choose a particular vendor that could supply equipment in that colour,” says Emerson senior sales director, Chris Mandahl.
“It was all about reducing their carbon footprint, and the only way they could do it was by data centre consolidation. Even though they’re using the same equipment and have the same output in terms of power and heat, they’re configuring it in a smaller area and a high-density environment that allows you the luxury of deploying energy-efficient strategies.”