5 Steps to a Big-Picture Approach to Virtualization

Think far beyond your servers from the start if you want to reap virtualization's wider potential in the data center, says Dave Robbins, CTO, Information Technology for NetApp.

In the current economic climate, organizations are cutting IT projects that are unable to show a strong return on investment within twelve months. But buoyed by the prospect of increased efficiency, lower costs, quick return on investment, and a more flexible model to align with primary business functions, virtualization is one of the IT projects getting almost universal buy-in from CIOs.

While server virtualization is delivering on these benefits in the data center, many CIOs are now contending with the unintended consequences and complexities it creates. These include virtual server sprawl, benefits that fall short of expectations, and rising shared storage costs, among others.

Here are five key considerations for CIOs and IT leaders who want to ensure that they are taking a holistic view of virtualization.

1. Assess your virtual environment readiness

You can't optimize what you can't measure. A detailed assessment before full project commitment helps you understand your current capabilities, justify current and future investments, and establish a baseline measurement that can be tracked and reviewed.
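
As a concrete starting point, a baseline can be captured with a short script and saved for later review. The sketch below is a minimal illustration, not a full assessment tool: the `capacity_baseline` helper is hypothetical, and a real assessment would sweep every candidate host and record CPU and memory utilization as well as storage.

```python
import json
import shutil
from datetime import datetime, timezone

def capacity_baseline(path="/"):
    """Snapshot storage capacity for one host volume (hypothetical helper)."""
    usage = shutil.disk_usage(path)
    return {
        "taken_at": datetime.now(timezone.utc).isoformat(),
        "path": path,
        "total_gb": round(usage.total / 1024**3, 2),
        "used_gb": round(usage.used / 1024**3, 2),
        "utilization_pct": round(100 * usage.used / usage.total, 1),
    }

if __name__ == "__main__":
    # Persist the snapshot as JSON so later reviews can track the trend.
    print(json.dumps(capacity_baseline(), indent=2))
```

Running this periodically across hosts gives the tracked, reviewable baseline the assessment step calls for.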

2. Investigate shared/unified storage to improve flexibility

A well-architected shared storage environment can help IT groups enable the advanced features of virtualized server environments without adding the overhead of additional management.

3. Revisit data backup and recovery plans

Virtualization means relying on fewer physical systems to process more tasks. Consolidating onto fewer machines without redesigning data protection will lead to performance issues, impede recovery, and limit overall project success. CIOs should carefully weigh the data protection options that fit their organization's specific needs to minimize the impact of events such as data, system, or site failure and ensure information is properly protected.
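
To see why consolidation strains data protection, a rough backup-window calculation helps. The sketch below uses illustrative figures (40 VMs of 100 GB each, 100 MB/s sustained throughput), not vendor benchmarks:

```python
def backup_window_hours(total_gb, throughput_mb_s):
    """Hours needed to stream a full backup of total_gb gigabytes
    at a sustained throughput of throughput_mb_s megabytes/second."""
    seconds = (total_gb * 1024) / throughput_mb_s
    return seconds / 3600

# Consolidating 40 VMs of 100 GB each behind one backup path:
total_gb = 40 * 100
hours = backup_window_hours(total_gb, 100)
print(round(hours, 1))        # → 11.4
print(hours <= 8)             # fits a nightly 8-hour window? → False
```

A full backup that no longer fits the nightly window is exactly the kind of issue that redesigned data protection (snapshots, incrementals, replication) must address.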

4. Consider thin provisioning

Server virtualization enables the rapid provisioning of applications, and administrators should consider thin provisioning for their storage environments to reclaim unused capacity, improve storage efficiency, and shorten time to market.
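
Thin provisioning presents more capacity than is physically allocated, consuming blocks only when data is actually written. A sparse file on a POSIX filesystem illustrates the same principle (a simplified analogy, not a storage-array implementation):

```python
import os
import tempfile

def thin_provision(path, virtual_size):
    """Create a sparse file: consumers see a file of virtual_size bytes,
    but the filesystem allocates blocks only as data is written, the
    same principle a thin-provisioned LUN applies to shared storage."""
    with open(path, "wb") as f:
        f.truncate(virtual_size)

with tempfile.TemporaryDirectory() as d:
    disk = os.path.join(d, "vm-disk.img")
    thin_provision(disk, 10 * 1024**3)           # present 10 GiB
    apparent = os.path.getsize(disk)             # size the consumer sees
    allocated = os.stat(disk).st_blocks * 512    # blocks actually in use (POSIX)
    print(f"apparent {apparent} bytes, allocated {allocated} bytes")
```

The gap between apparent and allocated size is the reclaimed capacity; the operational caveat is the same as for thin-provisioned arrays, since actual usage must be monitored so the pool is never oversubscribed past physical limits.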

5. Dig into data deduplication to maximize efficiency

By its very nature, a virtualized environment contains a great deal of duplicate data, such as copies of the same operating system and application software across virtual machines. Deduplication technologies allow an organization to use and manage less storage while still reaping the benefits of virtualization.
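
The effect is easy to quantify with a toy model: split data into fixed-size blocks, hash each block, and count unique hashes. The Python sketch below assumes fixed 4 KiB blocks purely for illustration; real deduplication engines differ in chunking and hashing strategy:

```python
import hashlib
import os

BLOCK = 4096  # fixed 4 KiB blocks, an assumption for this illustration

def dedup_ratio(data: bytes) -> float:
    """Ratio of logical blocks to unique blocks; higher means more
    duplicate data that deduplication can collapse into one copy."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    unique = {hashlib.sha256(b).digest() for b in blocks}
    return len(blocks) / len(unique)

# Ten identical "OS images" dedupe down to a single copy of each block.
image = os.urandom(4 * BLOCK)       # one 16 KiB pseudo-image
print(dedup_ratio(image * 10))      # → 10.0
```

Ten clones of one image yield a 10:1 ratio in this model, which is why VM farms built from common templates are such strong deduplication candidates.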

Dave Robbins serves as CTO, Information Technology for NetApp.
