Virtualized storage will cut costs

As today's e-business applications become increasingly data-centric, storage always seems to be in short supply.

Add the increased capacity demands of the multimedia and on-demand video increasingly deployed by human resources and marketing departments, and the task of keeping up with storage requirements can seem insurmountable.

In today's IT landscape, it is not uncommon to find complex networks of diverse, multivendor storage technologies taking on much of the challenge. Anyone responsible for ensuring storage availability across mixed networks knows what a challenging and expensive undertaking this can be.

Although existing solutions have made local access to LAN and WAN data sources possible, the hassle of running disparate protocols for accessing storage devices remains, as does the difficulty of accurately tracking storage capacities, which often results in waste.

Many vendors offer to resolve the data dilemma, but their options often require expensive, proprietary investments, and you must find one that matches your needs.

With a multitude of devices spanning the typical enterprise, data access can be difficult, backups are typically less than seamless, and taking full advantage of your storage capacities often remains impossible.

And that is where storage virtualization can be your friend.

Storage virtualization offers a means of addressing storage functionally rather than physically. It abstracts the physical process of storing data through software -- and sometimes hardware -- layers that map data from the logical storage space required by applications to the actual physical storage space.

Users can access storage without needing to know where a device is or how it's configured -- and without incurring significant performance overhead.

As an example, virtual storage would enable multiple, low-cost, commodity disks across the network to appear to the user as a single, multiterabyte disk. The back end could even include optical storage devices or a cache-management system, but the actual implementation remains completely transparent to the end user or application.
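
To make the idea concrete, here is a minimal, purely illustrative Python sketch of that mapping layer. The VirtualDisk class, device names, and 1MB allocation unit are hypothetical -- not any vendor's API -- but they show the essential trick: applications address one logical disk, and the layer decides which physical device actually holds each chunk.

```python
# A minimal sketch of a storage virtualization layer, assuming hypothetical
# device names and a fixed 1 MiB allocation unit. Applications address one
# logical disk; the class decides which physical device stores each chunk.

CHUNK_SIZE = 1 << 20  # 1 MiB per allocation unit (illustrative)


class VirtualDisk:
    """Presents several physical devices as one contiguous logical disk."""

    def __init__(self, devices):
        # devices: mapping of device name -> capacity in chunks
        self.capacity = dict(devices)                   # name -> total chunks
        self.next_free = {name: 0 for name in devices}  # name -> next unused chunk
        self.chunk_map = {}                             # logical chunk -> (name, physical chunk)

    @property
    def capacity_bytes(self):
        return sum(self.capacity.values()) * CHUNK_SIZE

    def _allocate(self):
        # Naive balancing policy: place new data on the device with the
        # most unused chunks.
        name = max(self.capacity,
                   key=lambda n: self.capacity[n] - self.next_free[n])
        if self.next_free[name] >= self.capacity[name]:
            raise IOError("virtual disk is full")
        physical = self.next_free[name]
        self.next_free[name] += 1
        return name, physical

    def write(self, logical_chunk, data):
        # The logical chunk is mapped to a physical one on first write;
        # callers never learn which device holds the data.
        if logical_chunk not in self.chunk_map:
            self.chunk_map[logical_chunk] = self._allocate()
        device, physical = self.chunk_map[logical_chunk]
        print(f"write {len(data)} bytes -> {device}, chunk {physical}")

    def read(self, logical_chunk):
        device, physical = self.chunk_map[logical_chunk]
        print(f"read <- {device}, chunk {physical}")


# Three commodity disks appear to the application as one multiterabyte volume.
vdisk = VirtualDisk({"disk-a": 1_000_000, "disk-b": 1_000_000, "disk-c": 2_000_000})
print(f"logical capacity: {vdisk.capacity_bytes / 2**40:.1f} TiB")
vdisk.write(0, b"order records")
vdisk.read(0)
```

The allocation policy here is deliberately naive; real products layer striping, mirroring, and caching behind the same basic mapping idea.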

Furthermore, because the storage appears as a single entity to the network administrator, a virtual disk makes it easier to redistribute, reconfigure, and add to storage capacity, allowing for more effective management and improved performance.
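
Continuing the same hypothetical sketch, adding capacity amounts to registering another backing device; every logical chunk the application has already written keeps its existing mapping, so nothing above the virtualization layer has to change.

```python
# Continuing the hypothetical VirtualDisk sketch above: registering one more
# commodity device grows the pool, while existing logical-to-physical
# mappings -- and therefore running applications -- are untouched.

def add_device(vdisk, name, capacity_chunks):
    vdisk.capacity[name] = capacity_chunks
    vdisk.next_free[name] = 0


add_device(vdisk, "disk-d", 4_000_000)
print(f"logical capacity now: {vdisk.capacity_bytes / 2**40:.1f} TiB")  # roughly doubled
vdisk.read(0)  # the earlier write is still served from the same device
```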

Because the actual back end can be built from reasonably priced, commodity storage components, and Pentium-based servers can replace expensive proprietary boxes, the open architecture of virtualization benefits your long-term ROI. You'll enjoy lower costs and an improved life span for your current hardware.

Adopting storage virtualization management software has many benefits, but it carries some caveats.

I often hear the terminology of storage virtualization bandied about without regard for implementations or standards. Although this year will likely see most SAN (storage area network) vendors jumping on the bandwagon and implementing some virtualization features in their product offerings, there are nearly as many implementation approaches as there are storage virtualization vendors.

Currently, though, most vendors seem to be building to their own vision: some optimize for speed, others for ease of maintenance, and still others focus only on improving backup capabilities.

Early adopters will be challenged by the fact that no standards exist among vendors. Be certain you know what your vendor's interpretation of storage virtualization is and how its implemented solution will benefit your business.

There are proposals on the table, notably from Hewlett-Packard and Sun Microsystems, for developing an interoperability framework. These efforts aim to improve the communication capabilities of cross-management systems, and they include handling requirements for many storage devices. But full adoption by vendors is a ways off.

You can find some good starting points for storage virtualization from companies such as Veritas (www.veritas.com), DataCore (www.datacore.com), IBM, and HP, although most any vendor with whom you work is likely to have a solution.

It's time to stop putting out fires and start flooding your storage pool with a common-sense alternative like storage virtualization. As the information glut grows, adopting open, scalable architectures will be imperative to meeting capacity demands in a cost-effective manner.

How is your business combatting data overload? Let me know at james_borck@infoworld.com.

James R. Borck is managing analyst in the InfoWorld Test Center. He reports on issues of e-business.
