Benchmarks guide purchasing decisions for many IT customers while also instigating arguments among vendors and industry observers. However, they remain an important tool in the corporate IT world for comparing system performance from vendor to vendor, and one of the most influential benchmarking organizations in the industry is preparing to update its specifications to reflect current usage models.
IT managers who consult the Transaction Processing Performance Council's (TPC's) benchmarks will have an updated metric to use when selecting servers later this year, according to representatives of the TPC.
The council is preparing to release a major revision to the TPC-W benchmark, which covers Web servers: one- to eight-processor machines that generate Web pages and process transactions over the Internet, said Michael Majdalany, administrator of the TPC. The revision is currently undergoing public review and is scheduled for release by the end of the year, he said.
Council members are also reviewing changes to the price/performance figures that are hotly contested by vendors and scrutinized by customers. The TPC wants customers to be able to compare price/performance results across the major TPC benchmarks, something they are currently unable to do, Majdalany said.
The TPC is made up of 21 full members including Advanced Micro Devices, BEA Systems, Dell, Hewlett-Packard, IBM, Intel, Microsoft, Oracle, Sun Microsystems and Unisys. About 40 organizations in total belong to the group, including a number of market research firms and user groups.
Four major benchmarks are currently administered by the TPC. TPC-C measures online transaction processing in large multiprocessor servers, and is one of the most widely cited benchmarks in vendor marketing materials. In addition to TPC-W, the council also sets guidelines for TPC-H and TPC-R. TPC-H measures decision-support performance in environments that require ad hoc queries to a database, while TPC-R measures decision-support performance in environments where the queries are known ahead of time.
Because of the potential marketing advantage of superior benchmark results, the TPC requires that all results submitted to the council be independently audited, said Michael Molloy, current chairman of the TPC and a senior manager with Dell.
Any company that wishes to publish a TPC benchmarking result must have that result audited to make sure all of the hardware and software used is publicly available, and not a "benchmark special," Molloy said. A detailed configuration of all the equipment used to produce the result must be published on the TPC's Web site (http://www.tpc.org) along with the result.
The integrity of the benchmarking results is maintained by the auditors and by the ability of council members to dispute results, Molloy said. Any aspect of a result can be challenged and voted on by the full council.
No one is under the illusion that benchmarks are perfect, but it is beneficial to have some metric with which to draw conclusions about various equipment, said Gordon Haff, senior analyst with Illuminata.
"Whatever the flaws of benchmarks, there is a value to having at least some relevant cross-industry comparisons," he said.
Those comparisons often help a company narrow its options when selecting a new vendor, ensuring the vendor can deliver the performance required by the company's applications, said John Stevenson, vice president and chief information officer at Sharp Electronics.
"There may be moments where you use benchmarks to narrow down the field. You might be worried if (that vendor) is trailing, unless they are the incumbent," Stevenson said. A strong existing relationship with a vendor can offset less impressive benchmark results as long as those results meet a company's base performance requirements and the vendor has a history of good service and support, he said.