Planning virtualization moves wisely

CiRBA's data center intelligence tool

Not so long ago, if Underwriters Laboratories needed to add three servers' worth of computing power in three weeks, the company simply bought three new units, says Kent Walker, manager of computer operations for UL. With more time for capacity analysis, which is both labor- and time-intensive, Walker might have been able to shift resources around and stave off the purchase. But when the need is immediate, there's no time for that.

That's why CiRBA's data center intelligence tool appealed to Walker. The product is designed to automatically analyze a server installation and recommend ways to consolidate hardware using virtual servers instead of physical ones. Walker invited CiRBA in for a 100-server trial evaluation to see whether the vendor could help rein in his sprawl of underutilized servers.

"They dazzled us," Walker says.

Other tools can tackle pieces of the analysis, such as examining server utilization levels to plan capacity. But CiRBA's tool is comprehensive, letting users factor in technical, business and workload constraints specific to their organizations, analysts say. A financial institution, for instance, might need to stipulate that its trading and research groups not share the same virtual servers due to regulatory requirements.
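
To make that concrete, a separation constraint of the kind described above could be expressed as a simple rule checked against any candidate placement. The Python sketch below is purely illustrative; the class, function and group names are invented and are not CiRBA's actual API.

```python
# Illustrative only: a hypothetical business constraint of the kind the
# article describes. All names are invented, not CiRBA's product API.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_group: str  # e.g. "trading" or "research"

def violates_separation(host_workloads,
                        restricted=frozenset({"trading", "research"})):
    """True if groups that must stay apart (e.g. for regulatory
    reasons) would share the same virtual host."""
    groups = {w.business_group for w in host_workloads}
    return restricted <= groups  # both restricted groups present together

# A placement mixing the two groups would be rejected by the analysis:
candidate = [Workload("order-engine", "trading"),
             Workload("risk-model", "research")]
print(violates_separation(candidate))  # True -> placement disallowed
```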

CiRBA gathers information about the data center in a variety of ways, including agent and agentless discovery mechanisms, and stores the details in a central repository. The tool then applies rules and constraints, builds a multi-dimensional model of the data center and spits out answers on the best way to optimize it, whether it's a physical server farm, a virtual environment, or a mixture of both.
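
At a high level, that pipeline can be pictured as: enumerate candidate consolidations, discard any that break a rule, and keep what fits. The sketch below is a loose, hypothetical rendering of that collect-constrain-decide flow, with invented names and a deliberately simplified single-metric model; it is not CiRBA's implementation.

```python
# Hypothetical sketch of a rules-driven consolidation pass: propose
# candidate server pairings, then filter them through constraint rules.
# Simplified to a single CPU metric; not CiRBA's implementation.
from itertools import combinations

def propose_pairs(inventory):
    """Enumerate pairs of servers as candidate consolidation targets."""
    return list(combinations(inventory, 2))

def fits_on_one_host(inventory, pair, cpu_cap=0.7):
    """Technical constraint: combined average CPU utilization must
    stay under a headroom cap on the shared host."""
    return sum(inventory[s]["cpu_util"] for s in pair) <= cpu_cap

def analyze(inventory, rules):
    """Keep only the pairings that satisfy every supplied rule."""
    return [p for p in propose_pairs(inventory)
            if all(rule(inventory, p) for rule in rules)]

inventory = {
    "web01": {"cpu_util": 0.12},
    "web02": {"cpu_util": 0.18},
    "db01":  {"cpu_util": 0.65},
}
print(analyze(inventory, rules=[fits_on_one_host]))
# -> [('web01', 'web02')]: the web pair fits; anything with db01 doesn't
```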

"Very few people have the majority of their servers virtualized, so our customers tend to put it on everything. They get a lot of visibility into both" their physical and virtual environments, says Andrew Hillier, CiRBA's co-founder and chief technology officer.

CiRBA's version 4.6, released last month, added advanced benchmarking to enable users to analyze how specific server workloads would perform on any virtualization platform and fit with other resources. It also built in probability analysis and workload-scoring strategies that factor in the risk and service levels an organization is willing to accept, as well as advanced network and storage analysis.
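
One way to picture risk-aware scoring of that general kind: treat historical utilization as time-aligned samples, estimate how often a consolidated host would breach capacity, and compare that estimate against the contention risk the organization will tolerate. The sketch below assumes nothing about CiRBA's actual method; the data, threshold and names are all invented.

```python
# Hypothetical illustration of risk-aware workload scoring: estimate the
# probability that a combined workload exceeds host capacity and compare
# it with an accepted risk level. Invented data; not CiRBA's method.

def overload_probability(samples_a, samples_b, capacity=1.0):
    """Fraction of time-aligned utilization samples whose sum
    exceeds the host's capacity."""
    breaches = sum(1 for a, b in zip(samples_a, samples_b)
                   if a + b > capacity)
    return breaches / len(samples_a)

# Hourly CPU utilization samples, as fractions of one host's capacity:
batch_job = [0.20, 0.30, 0.90, 0.95, 0.20, 0.25]
web_tier  = [0.50, 0.60, 0.40, 0.30, 0.55, 0.50]

risk_tolerance = 0.10  # the org accepts a 10% chance of contention
p = overload_probability(batch_job, web_tier)
print(f"{p:.2f}", "OK" if p <= risk_tolerance else "too risky")
# -> 0.33 too risky: these two workloads peak together too often
```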

With those new features, the tool for the first time can be used to model how applications will perform on IBM's System z mainframes and determine which workloads running in Unix- or x86-based environments might be better suited to mainframes.

The upcoming 5.0 version, due in the US summer, will build dynamic models of the data center that track daily changes, according to Hillier. A subsequent release, due in the US fall, is expected to provide tighter integration with major systems management frameworks, as well as enhanced support for storage analysis, he adds.
