Memo to the CEO

SAN FRANCISCO (07/05/2000) - Dear Colin:

Read your note re: the status of Macworld benchmarks. Frankly, I've been a bit reluctant to respond because, as the old joke begins, I've got some good news and some bad news.

First, the bad news. As you know, Macworld has been using MacBench as our baseline Mac testing tool, primarily because it quickly delivers accurate, repeatable results. That our readers can also get a copy of MacBench and test their own Macs' speed is an added benefit.

MacBench was great in a world where multiple vendors all sold variations of one Mac platform. Unfortunately, we no longer live in that world.

MacBench Broke

Late last year, the arrival of the PowerPC G4 broke MacBench: the tool's internal algorithms were not designed to take the Velocity Engine into account, so the test values it returned looked exactly like a G3's (without the Velocity Engine, the G4 is basically a G3). So we studied what it would take to get MacBench to the point where its results would be relevant for all Macs. Here's what we found:

A complete rewrite of the CPU test and all CPU-dependent tests would be required to support the G4. As part of the update, we'd have to decide how sensitive the new test should be to the Velocity Engine. After all, not all Mac applications are currently Velocity Engine savvy, and some key applications will never be able to take advantage of the G4 subprocessor.

Multiprocessing is also a challenge for MacBench, which is a single-processor test. That hasn't been a problem until now because multiprocessing Macs have never been a major factor on the platform, but it's clear that will change this year. The challenge, then, is creating a CPU test that can gauge discrete G3 and G4 speeds as well as multiple-G4 speeds, all on not one but two operating systems, Mac OS 9.x and Mac OS X.

On top of all this, if we went to the trouble of a ground-up rewrite of MacBench, could we get it to deliver test results that make sense not only relative to other MacBench numbers, but also relative to a user's experience?

After many conversations with Apple Computer Inc., and much research and debate in the Lab, our answer was an emphatic no.

Experience the Difference

Then it came to us. Speedmark, first developed four years ago, had been Macworld's suite of application-based tests that delivered a single number representing the performance of a system. Because it was based on applications, its results more accurately reflected a user's experience with a particular Mac model, and the new Speedmark 2.0 tests things all users care about, including network, Finder, and game speeds. Yes, the difference between a 350MHz iMac and a 400MHz iMac won't look nearly as dramatic with Speedmark as it does with MacBench, but in the real world, end users wouldn't see that dramatic a difference either.

And because it's based on common tasks, Speedmark is processor- and OS-independent. To the degree that the applications and tasks it includes work with the Velocity Engine, Mac OS X, or multiple G4s, Speedmark will deliver an accurate number, both relative to other Speedmark numbers and to the experience of the user.

Real-World Testing

With this in mind, we've decided to drop further development of MacBench.

Instead, we've developed a new version of Speedmark with an up-to-date suite of application and OS tasks that can accurately profile the performance of any Macintosh (complemented by the appropriate individual application tests).

Accordingly, we will standardize all system testing on Speedmark 2.0 as of the August 2000 issue.

With Speedmark 2.0, Macworld enters a new era of real-world benchmarks, where the results in print reflect real user experiences.

After all, how important is processor clock rate in measuring system performance? There are lots of reasons why a PowerPC clocked at a slower speed than an Intel chip runs faster: OS architecture, logic board design, I/O speed.

A fast hard drive will affect the perceived speed of most computers much more than a 50MHz delta in clock rate. Now we'll have a way of factoring all these issues into a single number. And that will be the most useful benchmark for our readers.
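Purely as an illustration (the memo doesn't spell out Speedmark's actual scoring formula, so the task names, timings, and weighting below are assumptions), one common way to fold several task timings into a single score is to normalize each task against a reference machine and combine the speedups with a geometric mean, scaled so the reference system scores 100. A minimal Python sketch:

    from math import prod

    # Hypothetical task timings in seconds (lower is faster); the task names
    # and numbers are made up for illustration, not actual Speedmark data.
    reference = {"finder_copy": 42.0, "network_fetch": 30.0, "game_level_load": 55.0}
    candidate = {"finder_copy": 36.0, "network_fetch": 28.0, "game_level_load": 50.0}

    def composite_score(ref, test, baseline=100.0):
        """Normalize each task time against the reference machine and combine
        the speedup ratios with a geometric mean (reference machine = 100)."""
        ratios = [ref[task] / test[task] for task in ref]   # >1.0 means faster than the reference
        geo_mean = prod(ratios) ** (1.0 / len(ratios))
        return baseline * geo_mean

    print(round(composite_score(reference, candidate), 1))  # prints 111.2 for this made-up data

A geometric mean keeps one unusually fast or slow task (say, a disk-bound Finder copy) from dominating the composite, which fits the goal of a single number that tracks overall user experience.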

The only thing remaining is to figure out how to explain this to our readers.

An ad campaign? Expert testimonials? A "Try Speedmark" pavilion at Macworld Expo? I suppose I could always just publish this memo in the magazine.

But that would be too easy, right?

Benchmark Andy's column. Send your comments to visionthing@macworld.com.
