Four keys to effective storage use

The big picture: Today's on-demand, instant action-reaction business environment requires real-time responsiveness to change -- whether that means meeting new customer demands, adjusting to shifts in the supply chain or countering unexpected competitive moves. This greater dependence on information translates into greater dependence on the effectiveness of the storage infrastructure.

IT budgets are getting tighter, and IT managers are increasingly trying to do more with fewer resources. An Aberdeen Group report from December 2002 found that 70% of IT departments had little increase or a decrease in storage and storage management budgets for 2002, compared with 2001. So how can companies reconcile these two contrary realities?

Networked storage and storage consolidation projects are gaining popularity because they are seen as critical answers to the complexity of open storage. In the "golden age," when annual requests for 15% to 20% IT budget increases were approved almost automatically, departments simply purchased what they needed when they needed it. The legacy of that buying spree is storage utilization that often sits at abysmal lows -- in the 35% to 45% range. Moving forward, companies stand to lose even more, because the complexity of managing storage across individual subsystems leads to low administrator productivity and more downtime from both planned and unplanned outages.

A total cost of ownership (TCO) study conducted by IBM in the fourth quarter of 2002 revealed that over the course of one year, the average mid- to large-size storage-area network customer with 5TB of storage could potentially save more than $170,000 in improved hardware utilization and more than $80,000 in improved administrator productivity with the right storage management system. The same study showed customers could save more than $2.6 million in lost opportunity costs due to better application availability through reduced planned and unplanned downtime.

Storage solutions can solve many of these challenges, but to do so, they must embody four key characteristics: They must be open, virtualized, autonomic and integrated.

Open

When a storage system is truly open, equipment from multiple vendors can not only sit side by side but also interoperate. This refers not just to server and software compatibility, but to the ability of two competing storage servers to share certain information and centralize management functions. Until now, customers have been stuck with a patchwork of technology rather than a truly integrated network, because many storage devices are unable to share information with one another, leaving isolated data "islands." Imagine a public library where each section, from reference to literature to periodicals, was located on a different side of town. That would hardly be an efficient way to manage a library -- or information, for that matter.

Hope for storage administrators is on the way in the form of the Storage Management Initiative Specification (SMI-S). After listening to customers' frustrations, vendors recognized that there were important benefits to be realized if they could find a way to work together. Working through the Storage Networking Industry Association, storage vendors developed SMI-S so that everyone could build technology that works together, allowing management information to flow seamlessly. Thanks to SMI-S, storage management islands should soon be a thing of the past.
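The promise of a common management interface can be sketched in a few lines of Python. The classes below are hypothetical stand-ins, not real SMI-S bindings; the point is that one management function can query devices from any vendor that implements the shared interface:

```python
from abc import ABC, abstractmethod

class StorageDevice(ABC):
    """Hypothetical common management interface, in the spirit of SMI-S."""
    @abstractmethod
    def capacity_gb(self) -> int: ...
    @abstractmethod
    def used_gb(self) -> int: ...

class VendorAArray(StorageDevice):
    # Illustrative numbers only.
    def capacity_gb(self): return 2048
    def used_gb(self): return 900

class VendorBArray(StorageDevice):
    def capacity_gb(self): return 4096
    def used_gb(self): return 1200

def pool_utilization(devices):
    """One management view across vendors: total used / total capacity."""
    total = sum(d.capacity_gb() for d in devices)
    used = sum(d.used_gb() for d in devices)
    return used / total

print(round(pool_utilization([VendorAArray(), VendorBArray()]), 2))  # → 0.34
```

Without a shared interface, that one-line utilization report would require vendor-specific code for every device in the shop.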

Virtualized

Storage virtualization products separate physical storage from the logical view seen by the servers. The physical storage could reside anywhere within a company's storage network and could be from any vendor, while the storage virtualization layer provides a logical view of all storage connected to it. Instead of individual storage systems remaining separate and distinct, storage resources are pooled together into a single reservoir of capacity. This allows business user departments or application owners to describe their storage needs in terms of capacity, response time, cost and backup frequency. It also allows IT administrators to make the best use of resources by dispensing storage capacity or processing power only when needed.
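The pooling idea can be made concrete with a minimal, hypothetical sketch -- the class and its greedy allocation policy are invented for illustration, not drawn from any product:

```python
class VirtualizationLayer:
    """Sketch: pool physical arrays, carve logical volumes (assumed design)."""
    def __init__(self):
        self.free = {}      # array name -> free GB
        self.volumes = {}   # volume name -> list of (array, GB) extents

    def add_array(self, name, gb):
        self.free[name] = gb

    def create_volume(self, name, gb):
        """Allocate extents from whichever arrays have space -- the server
        sees one logical volume regardless of where the blocks live."""
        extents, needed = [], gb
        for array in sorted(self.free, key=self.free.get, reverse=True):
            take = min(needed, self.free[array])
            if take:
                self.free[array] -= take
                extents.append((array, take))
                needed -= take
            if not needed:
                break
        if needed:
            raise RuntimeError("pool exhausted")
        self.volumes[name] = extents
        return extents

pool = VirtualizationLayer()
pool.add_array("vendor_a", 100)
pool.add_array("vendor_b", 60)
print(pool.create_volume("erp_data", 130))
# → [('vendor_a', 100), ('vendor_b', 30)]
```

The 130GB logical volume spans two physical arrays, but the application owner never needs to know that -- which is exactly the separation the paragraph above describes.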

Gartner Inc. expects storage virtualization to gain prominence within the storage management arena as companies look for ways to get better use out of their storage resources. Gartner also predicts that the worldwide storage virtualization market will grow about 14% annually, from $200 million in 2001 to around $400 million in 2006.
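The two figures are consistent if 14% is read as a compound annual rate: five years of 14% growth takes $200 million to roughly $385 million, in line with the "around $400 million" projection.

```python
market = 200            # $M in 2001
for _ in range(5):      # five years of growth, 2001 -> 2006
    market *= 1.14      # 14% compound annual growth
print(round(market))    # → 385 ($M in 2006)
```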

Autonomic

If a key division or enterprise data repository goes down because of a disaster or virus attack at 3 a.m., how fast can a human storage administrator react? Will that administrator be fast enough to save the strategic information in the database? What would be the impact of even a single minute of delay?

Autonomic computing refers to the ability of IT systems to self-configure, self-heal, self-protect and self-optimize. Autonomic storage systems can react to such failures faster than humanly possible, sparing a company costly downtime.

Some storage systems come with predictive error analysis capabilities that can determine where a problem might occur and take corrective action before it does. Other autonomic capabilities include automating failover and failback, balancing workloads and managing software upgrades without human intervention. All these processes are aimed at keeping critical data available to applications 24 hours a day, 365 days a year.
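As a rough illustration of predictive error analysis, the sketch below watches a hypothetical error counter and escalates before a hard failure. The thresholds and actions are invented for the example, not taken from any real product:

```python
def autonomic_check(device):
    """Sketch: escalate corrective action as error counts climb,
    before the device fails outright (thresholds are illustrative)."""
    if device["media_errors"] >= 50:
        return "fail over to mirror"       # self-heal before data loss
    if device["media_errors"] >= 10:
        return "migrate data, flag drive"  # act ahead of a hard failure
    return "ok"

print(autonomic_check({"media_errors": 3}))   # → ok
print(autonomic_check({"media_errors": 20}))  # → migrate data, flag drive
print(autonomic_check({"media_errors": 75}))  # → fail over to mirror
```

Run on a schedule, a loop like this acts in seconds at 3 a.m. -- no pager, no human in the critical path.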

Less dramatic but just as important, autonomic capabilities can help companies reallocate capacity to scale during peak hours and reduce the number of administrative tasks that must be done manually.

Integrated

Integration is about connecting the dots. Fragmented, disparate storage systems run counter to the promise of storage networking. Through integration, companies can make better use of their resources, gain manageability and ease of administration, and lower the TCO of their entire infrastructure, including servers, disk and tape backup systems.

In today's business world of instant action and quick reactions, information is key. An integrated, highly available and robust data storage infrastructure is indispensable. A consolidated and well-linked storage environment can enhance a company's ability to get information and respond quickly, on demand, around the clock.


As budgets and storage growth continue on their divergent paths, the gap between management expectations and the ability of the storage infrastructure to deliver will widen. To close that gap, companies will have to find new ways to improve the use, management and resilience of their data storage resources.

When push comes to shove, open standards, storage virtualization tools, autonomic capabilities and integrated solutions can make storage managers more productive, enterprise applications more available and storage assets better utilized.

- Brian J. Truskowski is general manager of IBM Systems Group's storage software division. Previously, as chief technology officer of the storage group, he was responsible for the overall technical strategy of the storage systems group.
