Cash for stash

For IT execs, getting the cost/benefit message across, not only to business units but to the bean counters, is often an argument that can't be won - at least not until that document vital to the chief bean counter makes itself untraceable.

So how do you win the battle?

Paul Stonchus, IT manager at a large bank, says he needs his systems and business analysts to understand the bank's tiered storage technologies so they can better utilize tiered storage for their applications. "I can't make ILM (information life-cycle management) a reality all by myself," Stonchus says.

He manages an all-EMC environment that is made up of about 5 percent DMX, 65 percent Clariion and 35 percent Centera arrays as well as tape storage. But the bank's mainframe computer output to laser disk (COLD) application is the only one that takes advantage of tiered storage. He plans to extend tiered storage benefits to his e-mail archive application by the end of the year.

Speed is another part of the battle and, according to Craig Tamlin, manager for Quantum ANZ, an increasingly challenging one.

"These days, storage systems are far quicker at storing and retrieving data than the hosting server's capability to manage it. For example, an average Windows server serves up data to a tape drive at between 15-25MB/sec, yet we have tape drives on the market which can perform at 90-120MB/sec should the source data rate be able to support it. These tape drives are therefore a better fit for Unix systems or for backing up a SAN," he said.

On a cost basis, Tamlin says there is nothing cheaper than tape. With Quantum's latest release, DLT-S4 offering up to 1.6TB per tape (assuming 2:1 compression), the cost is 9c per GB (including GST, he added).
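The throughput and cost figures above can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below is illustrative only: the 1TB backup set is a hypothetical workload, and the rates are simply mid-points of the ranges Tamlin quotes.

```python
def backup_hours(data_gb, rate_mb_per_s):
    """Hours to stream data_gb at a sustained rate (1 GB = 1000 MB)."""
    return data_gb * 1000 / rate_mb_per_s / 3600

DATA_GB = 1000        # hypothetical 1 TB backup set
HOST_RATE = 20        # MB/sec, mid-point of the 15-25 Windows server range
DRIVE_RATE = 105      # MB/sec, mid-point of the 90-120 tape drive range

# A modern drive can empty the same backup set in a fraction of the time,
# but only if the source can feed it fast enough.
host_limited = backup_hours(DATA_GB, HOST_RATE)    # about 13.9 hours
drive_limited = backup_hours(DATA_GB, DRIVE_RATE)  # about 2.6 hours

# The quoted 9c/GB at 1.6TB (2:1 compressed) implies a cartridge price
# of roughly $144 for a DLT-S4 tape.
implied_cartridge_cost = 0.09 * 1600

print(f"Host-limited backup:  {host_limited:.1f} h")
print(f"Drive-limited backup: {drive_limited:.1f} h")
print(f"Implied cost per cartridge: ${implied_cartridge_cost:.0f}")
```

The gap between the two backup times is the point of Tamlin's example: a Windows host feeding the drive at 20MB/sec, not the drive itself, sets the length of the backup window.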

Retrieval times can also add heat to the debate and that is where a cost-benefit analysis needs to be performed.

Tamlin says the cheapness of tape is offset by its sequential nature.

"Rapid random retrievals of small chunks of data from tape is relatively slow. Clearly, random access devices, such as disk, are better suited to such retrieval workloads but it is fascinating to talk with many clients and discuss their needs for data retrieval.

"All say that they want faster retrievals, but when quizzed about their restore profile (how much data and how frequently they restore), some say they rarely restore at all."

Tape is the better solution for such clients, Tamlin says, while clients with high volumes and frequencies of retrievals will definitely benefit from a layer of disk in their secondary storage environment.

"There is a third category of those who want disk, but can't afford the right disk. They purchase capacity-oriented disk which has sufficient performance for small random retrievals, but is slower than tape for the nightly backup; this is a backward step.

"If retrieval frequency and volume are high, so should the budget allocated to the task," Tamlin said.
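Tamlin's restore-profile argument can be sketched as a toy model. The numbers below are entirely illustrative assumptions (load/seek penalty, streaming rates, restore profiles), not figures from any vendor; the point is only that tape's fixed per-restore penalty dominates for frequent small restores and amortizes away for rare large ones.

```python
def tape_restore_seconds(size_gb, load_seek_s=120, stream_mb_s=100):
    """One assumed load-and-seek penalty per restore, then sequential streaming."""
    return load_seek_s + size_gb * 1000 / stream_mb_s

def disk_restore_seconds(size_gb, stream_mb_s=200):
    """Random-access disk: effectively no seek penalty at this scale."""
    return size_gb * 1000 / stream_mb_s

# Profile A: one 500 GB restore per month - the seek penalty is noise.
rare_large_tape = tape_restore_seconds(500)        # ~5120 s, penalty is ~2%
rare_large_disk = disk_restore_seconds(500)        # ~2500 s

# Profile B: 200 restores of 0.1 GB per month - the penalty dominates.
freq_small_tape = 200 * tape_restore_seconds(0.1)  # ~24200 s, mostly seeking
freq_small_disk = 200 * disk_restore_seconds(0.1)  # ~100 s

print(f"Rare/large  - tape: {rare_large_tape:.0f}s, disk: {rare_large_disk:.0f}s")
print(f"Freq/small  - tape: {freq_small_tape:.0f}s, disk: {freq_small_disk:.0f}s")
```

Under these assumptions, Profile A loses little by staying on tape, while Profile B spends nearly all of its restore time waiting on loads and seeks - which is exactly the client who should fund a disk tier.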

Yet another problem that execs encounter is the architecture and maturity of products that manage the movement of data between tiers of storage. Ruth Mitchell, a storage administrator at a university, used to move files between different tiers of back-end storage using the Hierarchical Storage Management feature in IBM's Tivoli Storage Manager for what had been an all-AIX environment. When the university switched to Windows a few years ago, IBM did not have native Windows support, and Mitchell could not find a third-party product mature enough for her shop.


To deal with such hurdles, new classification and virtualization products are emerging that allow users to get the full benefits of tiered storage deployments. Scentric sells a product called Destiny that allows users to discover, classify and move data on Windows servers and CIFS-compliant network-attached storage (NAS) appliances without the use of agents.

George Rodriguez, an IT manager at ABC Distributing, recently introduced tiered storage into his shop after using Scentric's Destiny data classification tool to turn the promised savings of ILM into reality.

Before using Destiny, Rodriguez was constantly allocating more Tier 1 storage on his high-end IBM Shark array to deal with the data growth on his Windows servers. "I had the IBM Shark divided between mainframe and Windows but had to occasionally empty IBM Shark LCUs (logical control units) assigned to the mainframe so I could re-assign them to my Windows servers," Rodriguez says.

Using the reports Destiny produced, Rodriguez was first able to show management and users how many old files they were storing and how infrequently those files were being accessed. Rodriguez now sets policies that automate the classification, movement and retention of files on his Windows servers.

Scentric's reporting feature allowed Rodriguez to justify to his managers the purchase and introduction of another tier of storage at half the price of Tier 1 storage, without increasing management overhead and costs. Rodriguez configures Destiny to keep files on Tier 1 storage for 45 days and on Tier 2 for another three months, then moves the files to another location where they are backed up and deleted from disk. Rodriguez says his only regret so far is, "I wish Scentric offered similar functionality for the mainframe."
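The age-based schedule Rodriguez describes can be sketched as a simple classification rule. This is not Destiny's actual policy engine - just a minimal illustration of the 45-day / three-month tiering logic, with three months approximated as 90 days.

```python
from datetime import datetime, timedelta

def tier_for(last_access: datetime, now: datetime) -> str:
    """Mimic the described schedule: 45 days on Tier 1, roughly
    another three months on Tier 2, then off to the archive tier
    (where files are backed up and removed from disk)."""
    age_days = (now - last_access).days
    if age_days <= 45:
        return "tier1"
    if age_days <= 45 + 90:
        return "tier2"
    return "archive"

now = datetime(2006, 6, 1)
for days_old in (10, 60, 200):
    f = now - timedelta(days=days_old)
    print(f"{days_old:>3} days old -> {tier_for(f, now)}")
```

A real classification product would key policies on more than age (file type, owner, access frequency), but an age threshold is the piece that directly drives the Tier 1 capacity savings described above.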

Organizations are also deploying array-based block virtualization to better manage and control their tiered storage environments. Fidelity National Title Group Inc. in Jacksonville, Fla., covers nearly one-third of all U.S. real estate title insurance policies and needed a system that would handle the 2.5 million transactions it processed annually. The system also had to maintain its online repository of 45 million historical documents.

To manage the data, the insurance company implemented a three-tier storage design that uses Hitachi's TagmaStore Universal Storage Platform (USP) for Tier 1 high-performance storage requirements, which in turn virtualizes the company's Tier 2 HDS Thunder 9585V modular storage systems. This architecture enables non-disruptive migration of data between tiers, moving data with lower-performance demands to cheaper storage as application needs dictate.

New data classification and virtualization software and CAS appliances are helping users in shops of all sizes start to bridge the gap between the promises and the reality of tiered storage deployments. But other companies are still waiting on vendors such as CA, EMC and Symantec to integrate their software acquisitions of the last few years into products that deliver enterprise-wide data classification and migration capabilities. - With Gabrielle Wheeler and Jerome Wendt