Despite advances in tiered storage technology, many IT managers say they cannot determine the value of their companies' data and have no automated way to manage it.
For example, Laura Fucci, chief technology officer at the Las Vegas-based MGM Mirage hotel and casino chain, said her department has implemented a tiered storage infrastructure for its 180TB of data. Nevertheless, the company is still trying to better manage its storage, she said.
Fucci was among several speakers at the Storage Decisions 2005 conference who spoke about information life-cycle management and tiered storage.
"One problem we have at MGM Mirage is we don't have an articulated storage [management] policy. We're going to tackle that next year," Fucci said. "We're formulating an enterprise-wide document retention policy, which defines the retention for various levels of documents and included in that is our existing policy for how we handle sensitive information, like credit card data."
MGM Mirage is in the process of implementing Symantec's Veritas Enterprise Vault software to archive e-mail and plans to do the same thing for the company's file systems next year.
"Information just keeps growing. Our demand for storage keeps growing. I'm not sure if we're ever ahead of the storage problem, but we're going to do something to keep up," Fucci said.
Of roughly 250 users polled by conference organizers, 51% said they have no way of determining the cost of storing data over time. Another 47% said they have a tiered storage model and some idea of storage costs but no way to automatically migrate data between tiers. Only 7% said they can definitely determine the value of their data.
Gary Schwimmer, a data center operations manager at Los Angeles-based Northrop Grumman, said his company has developed a data retention policy that involves tagging data using the Standard Generalized Markup Language (SGML) to determine what to move and when to move it.
But Schwimmer said that migrating data from one tier to another is still a manual process that's prompted by an automated e-mail notification system developed in-house.
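A tag-driven policy like the one Schwimmer describes can be reduced to a simple rule check. The sketch below is illustrative only: the `RETCLASS` tag name, the retention classes, and the day thresholds are hypothetical, not Northrop Grumman's actual scheme.

```python
from datetime import date, timedelta

# Hypothetical tier policy: retention class -> days of age before the data
# becomes eligible to move down a tier. Names and thresholds are invented.
MIGRATION_AFTER_DAYS = {
    "active": 90,      # e.g. primary disk -> secondary disk
    "reference": 365,  # e.g. secondary disk -> tape
}

def parse_retention_class(tagged_record: str) -> str:
    """Extract the retention class from an SGML-style tag,
    e.g. <RETCLASS>active</RETCLASS>."""
    start = tagged_record.index("<RETCLASS>") + len("<RETCLASS>")
    end = tagged_record.index("</RETCLASS>")
    return tagged_record[start:end]

def is_due_for_migration(tagged_record: str, created: date, today: date) -> bool:
    """Return True when the record's age exceeds its class's threshold."""
    retclass = parse_retention_class(tagged_record)
    threshold = MIGRATION_AFTER_DAYS.get(retclass)
    if threshold is None:
        return False  # unclassified data stays put until a policy exists
    return (today - created) > timedelta(days=threshold)

record = "<DOC><RETCLASS>active</RETCLASS>quarterly report</DOC>"
print(is_due_for_migration(record, date(2005, 1, 1), date(2005, 10, 1)))  # True
```

In a setup like the one described, a check along these lines would flag eligible data and trigger the notification e-mail; the actual move between tiers would remain a manual step.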
The IT managers said another big issue is finding ways to ensure that data is deleted at the end of its useful life. While some said they delete everything after a set period of time, others said their data often sits in external storage vaults, incurring ongoing fees and requiring periodic migration to newer tape technology.
Schwimmer said Northrop Grumman's data deletion policy requires that everything go after 10 years. But, he added, "we're struggling like everyone else. The big part is convincing people it's going to [require] an investment to make things change."
Richard Scannell, a consultant at GlassHouse Technologies in Framingham, Mass., said IT managers can't afford not to begin deleting data. Even if the capacity of new storage systems doubles every 18 months, it will never be enough to keep up with data growth, he said. Statistics show that up to 74% of all data storage costs can be attributed to maintenance and administration of existing storage, he added.
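Scannell's point can be seen with simple compounding arithmetic. The figures below are illustrative: the article gives no data growth rate, so an assumed annual doubling of stored data is used against capacity that doubles every 18 months.

```python
# Illustrative only: capacity doubling every 18 months vs. data assumed to
# double every 12 months. Starting sizes are arbitrary.
capacity = 100.0  # TB installed today
data = 80.0       # TB stored today

for year in range(1, 6):
    capacity *= 2 ** (12 / 18)  # doubling every 18 months ~= 1.59x per year
    data *= 2.0                 # assumed annual doubling of data
    print(f"year {year}: capacity {capacity:.0f} TB, data {data:.0f} TB")
```

Under these assumptions the data curve overtakes capacity within a year or two and the gap widens every year after, which is why deletion, rather than more hardware, is the only move that scales.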
Craig Taylor, associate director of open systems at Chicago Mercantile Exchange Holdings, said his group is working to determine how to classify data so migration policies can be created. Taylor's group has built an elaborate storage infrastructure with five tiers of data storage that include EMC's Symmetrix arrays, secondary disk storage systems from Copan Systems and tape libraries from Storage Technology, which was recently acquired by Sun Microsystems Inc.
Even so, noted Taylor, "do we have any physical deletion policy? No."