No other storage topic is more sensitive for vendors and more important for potential customers than performance measurement. I've had vendors refuse to send me a product for review because of disagreements on which speed benchmarks to use during the evaluation.
Even when there is agreement on the tools used to measure performance, vendors may still disagree on how to interpret the results and dispute their relevance.
I won't take sides in that debate. It's rather obvious that those quarrels don't help buyers who are trying to figure out whether a given solution fits their requirements.
In December 2001, the Storage Performance Council announced SPC-1, the first benchmark designed specifically to measure storage performance.
SPC-1 and its logical complement, SPC-2, are designed to measure the performance of storage subsystems regardless of their connection to application servers, and to simulate typical business workloads. It's a gross simplification, but to put them in context, think of SPC-1 as a "random" benchmark and of SPC-2 as a "sequential" benchmark. You can learn more about the two benchmarks and read their specs here.
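To make the random-versus-sequential distinction concrete, here is a minimal sketch of the two access patterns. It is purely illustrative and not drawn from the SPC specifications; the function names, block size, and device size are my own assumptions.

```python
import random

BLOCK_SIZE = 4096           # bytes per I/O request (illustrative choice)
DEVICE_BLOCKS = 1_000_000   # hypothetical device size, in blocks

def sequential_offsets(n, start=0):
    """Byte offsets for a sequential stream (SPC-2-style): each
    request immediately follows the previous one on the device."""
    return [((start + i) % DEVICE_BLOCKS) * BLOCK_SIZE for i in range(n)]

def random_offsets(n, seed=42):
    """Byte offsets for a random workload (SPC-1-style): each
    request can land anywhere on the device, so the drive or array
    must seek between requests."""
    rng = random.Random(seed)
    return [rng.randrange(DEVICE_BLOCKS) * BLOCK_SIZE for _ in range(n)]
```

A real benchmark mixes reads and writes, queues requests concurrently, and models business workloads in far more detail, but the core difference the two SPC benchmarks exercise is visible even here: sequential offsets advance by exactly one block each time, while random offsets jump across the whole device.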
Why didn't the SPC initiative gather more followers? I'm not sure whether things have changed since, but according to reports from 2001, disagreements over the randomness of the SPC-1 workload drove at least one vendor, EMC, to leave the council.
Five years later, EMC is still keeping its distance. The council now has about 30 members, but that number sounds rather unimpressive considering the hundreds of vendors crowding today's storage market.
However, SPC has steadily (if not quickly) increased the number of published benchmarks and recently added an SPC-2 Toolkit (initially for AIX, Solaris, and Windows Server 2003) that should go on sale online any day now. The council is also working on additional benchmarks to measure the performance of basic components of a storage system, including HBAs and disk drives, as well as software such as logical volume managers.
Will these efforts attract more attention and followers to SPC? Perhaps, but as the number of published benchmarks grows, SPC should make those results more easily accessible. Right now, to assess how a storage system such as the Fujitsu Eternus fared in the SPC tests, you have to dig the numbers out of a PDF file. Neither the friendliest nor the quickest way to find what you need.
Worse yet, the SPC website doesn't allow you to search those benchmark results, and it has no option to download them to a spreadsheet, an option that other benchmark organizations, including SPEC, have offered for quite some time.
"That implementation was a unanimous decision of the council to prevent inappropriate comparisons," SPC administrator Walter E. Baker said in an email exchange. Well, they were unanimously wrong, in my opinion, because that decision also makes it extremely difficult for customers to find solutions that meet their needs, which defeats the purpose of publishing performance numbers in the first place.
Baker, however, concedes that "the increasing number of SPC Results and increased use of those results require a more sophisticated, user-friendly method of selecting and accessing SPC Result information of interest." So by the end of the quarter, Baker said, we should be able to search the results.
I will revisit the SPC site at the end of the first quarter to see whether the council follows through.