Achieving meaningful storage management metrics

Years ago, when IT folk and vendors still bandied about the term MIPS, an over-repeated witticism of the time was that the acronym actually stood for "meaningless indication of processor speed."

Effectively meeting business demands through service-level agreements, driving efficiency and addressing corporate governance policies all require relevant metrics. These key performance indicators (KPIs) should provide a meaningful aggregation of lower-level data to enable informed planning and decision-making at senior levels.

Determining this information is not easy. Identifying appropriate metrics that truly provide the right insights and then assembling the necessary data from the disparate sources within the storage mosaic is a major endeavor.

Here are three examples of storage metric challenges:

Cost per gigabyte: Cost modeling is an arcane science. We have seen (and even developed) some extremely elaborate models, but effective cost models need not be overly complex to measure what is needed for a particular environment. They must be developed in conjunction with the finance and purchasing functions within the organization, and they must accurately reflect all significant operational and technical costs. Administration and management, port and bandwidth requirements, and data-policy factors can dwarf the cost of spinning disk.
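To make the aggregation concrete, a fully loaded cost-per-gigabyte figure can be sketched as a simple sum of cost categories divided by usable capacity. The categories and dollar figures below are hypothetical placeholders, not benchmarks; the point is only that labor, ports, bandwidth and data-policy costs sit alongside hardware in the numerator.

```python
# Hypothetical fully loaded cost-per-GB model.
# All figures are illustrative placeholders, not industry benchmarks.
def cost_per_gb(usable_gb, hardware, admin, ports_bandwidth, data_policy):
    """Annualized cost per usable gigabyte, including operational costs."""
    total = hardware + admin + ports_bandwidth + data_policy
    return total / usable_gb

annual_costs = {
    "hardware": 120_000,        # arrays, spinning disk, maintenance
    "admin": 180_000,           # administration and management labor
    "ports_bandwidth": 60_000,  # SAN ports, replication bandwidth
    "data_policy": 90_000,      # backup, retention, compliance handling
}
per_gb = cost_per_gb(500_000, **annual_costs)  # 500 TB usable capacity
```

In this made-up breakdown the spinning disk is barely a quarter of the total, which is exactly why a model agreed on with finance and purchasing matters more than model complexity.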

Backup success rate: Knowing that last night's backups succeeded is important -- for a backup administrator. But it doesn't go far enough. From a business perspective, what is critical to know, and all too often unreported, is the overall recoverability status of an application. Deriving this information requires more than the backup success rate. It demands an understanding of application interdependencies, application-to-server mapping and integration of other data-protection components, such as snapshots, split mirrors and replicated volumes.
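The rollup described above can be sketched as a simple status aggregation: an application is recoverable only if every server it maps to is covered by at least one healthy protection component. The data structures and component names here are hypothetical, not drawn from any particular backup product.

```python
# Sketch: derive application-level recoverability from component-level
# protection status. Structures and names are hypothetical.
PROTECTION_TYPES = ("backup", "snapshot", "mirror", "replica")

def app_recoverable(app, server_map, component_status):
    """True only if every server the application depends on has at
    least one healthy protection component."""
    return all(
        any(component_status.get((server, c), False)
            for c in PROTECTION_TYPES)
        for server in server_map[app]
    )

server_map = {"payroll": ["db01", "app01"]}
component_status = {
    ("db01", "backup"): True,
    ("app01", "backup"): False,   # last night's backup failed...
    ("app01", "snapshot"): True,  # ...but a snapshot still covers it
}
```

Here `app_recoverable("payroll", ...)` reports the application as covered even though one backup job failed, which is the kind of business-level answer a raw backup success rate cannot give.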

Terabytes per storage administrator: This is one of my storage management pet peeves. Wouldn't it be great to be able to boil down staffing requirements to a single metric? Unfortunately, you can't. Storage staffing is largely a function of complexity. Factors such as the variety of technologies, the number of supported applications and the volume of provisioning requests far outweigh the quantity of storage being managed.

Storage KPIs should address three areas: operational efficiency, risk and user satisfaction. A single tool can't (yet) provide what is needed, but investing some effort in determining what is needed and figuring out how to assemble it can pay big dividends.

Jim Damoulakis is chief technology officer of GlassHouse Technologies, a leading provider of independent storage services. He can be reached at
