In the spring of 1999, Ameritrade Inc. ran one of the slowest brokerage sites on the Web. Today, the company consistently ranks among the five fastest Web sites for executing stock trades. For the past 34 weeks, Ameritrade has also recorded the quickest home page download times among consumer Web sites.
Eighteen months ago, Ameritrade made Web site performance a focal point for its IT department, investing "substantial" sums in infrastructure and testing tools. Since then, Ameritrade's user base has soared from 400,000 to 1.4 million.
"We pay very close attention to Web performance statistics," says Jim Ditmore, chief information officer of Ameritrade. "Your reliability and your performance are the key underlying components of your customer's experience. You want to have great customer service and features on your site, but if it takes forever to load, no one will use them."
Ameritrade is not alone. Most successful e-businesses measure and monitor every aspect of their Web site's performance, accuracy and reliability. IT managers look at the performance statistics on a daily - even hourly - basis to quickly identify and solve performance problems.
Outside the IT department, Web site performance reports are reviewed by marketing managers, product line managers, operations executives, CEOs and investors. These nontechnical observers care about two metrics: speed and reliability. They want to see constant improvement in both areas.
"Uptime. Page load time. Transaction time. These are CEO-level statistics," says Bruce Weiner, CEO of Eblast Ventures, a Chicago firm that has built 14 Web sites since its launch in April. "Our CEOs are looking at these numbers daily."
Web site performance statistics are "an indication of how well we are doing operationally from an IT perspective," Ditmore says, noting Ameritrade has found that having one of the Web's fastest sites has helped recruit and retain top engineering talent.
Many e-businesses such as Ameritrade employ two types of Web site performance testing tools:
Internal testing tools that let IT departments measure the speed, accuracy and scalability of their Web sites prior to launch. With these tools, Web site developers set up test scripts that mimic how a user will conduct a transaction. Then they bombard the site with these simulated users to see how much traffic the site can handle. The tools, offered by Mercury Interactive, RadView, Agilent and others, help developers locate bottlenecks.
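The approach these tools take can be sketched in a few lines. This is a simplified illustration, not any vendor's actual product: `simulated_transaction` is a hypothetical stand-in for a scripted user session (a real test script would drive the site over HTTP), and the sleep merely simulates server response time.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_transaction():
    """One scripted 'virtual user' session; returns its latency in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # stand-in for the real request/response round trip
    return time.perf_counter() - start

def run_load_test(num_users):
    """Fire num_users simulated users concurrently and collect latencies."""
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        latencies = sorted(pool.map(lambda _: simulated_transaction(),
                                    range(num_users)))
    return {
        "requests": len(latencies),
        "median_s": latencies[len(latencies) // 2],
        "worst_s": latencies[-1],
    }

report = run_load_test(20)
print(report)
```

Ramping `num_users` upward until the worst-case latency degrades is, in essence, how developers use such tools to find a site's breaking point.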
External measurement services from Keynote, Freshwater and Mercury provide continual monitoring of a Web site's speed and availability after it is launched. These companies have servers located on different Internet backbones around the world that send transactions to a Web site on a regular basis, often every 15 minutes. The services track response times and provide daily reports and real-time alerting if problems are found.
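The monitoring half of that loop reduces to a probe-and-classify cycle. The sketch below is a hypothetical illustration of the pattern, not any vendor's service; the two-second threshold and the status labels are invented for the example.

```python
import time

RESPONSE_THRESHOLD_S = 2.0  # hypothetical alerting threshold

def probe_site(send_transaction):
    """Time one synthetic transaction; return (succeeded, elapsed_seconds)."""
    start = time.perf_counter()
    try:
        send_transaction()
        return True, time.perf_counter() - start
    except Exception:
        return False, time.perf_counter() - start

def evaluate(ok, elapsed):
    """Classify a probe result the way an alerting service might."""
    if not ok:
        return "DOWN"
    if elapsed > RESPONSE_THRESHOLD_S:
        return "SLOW"
    return "OK"

# A fast probe, and one that raises to simulate an unreachable site.
ok, elapsed = probe_site(lambda: time.sleep(0.01))
status_fast = evaluate(ok, elapsed)

def failing_probe():
    raise ConnectionError("timeout")

ok2, elapsed2 = probe_site(failing_probe)
status_down = evaluate(ok2, elapsed2)
print(status_fast, status_down)
```

A monitoring service runs this cycle from many network vantage points on a fixed schedule, logging the elapsed times for daily reports and paging someone when the status is anything but OK.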
Corporate interest in both kinds of tools is high, says Bill Gassman, a senior research analyst at Gartner Group, who estimates the market for tools that measure Web application response time will reach US$150 million this year. "It's the combination of these two kinds of tools that really makes sense," he says.
In the past year, customers have deployed tools that look at end users' online experience rather than measuring page download and refresh times. The key metrics are how long it takes to complete a transaction - such as buying a book or trading a stock - and how often that transaction is available.
"Our primary focus is on the end-user experience," says Deanna Kosaraju, director of the application product line at BitLocker, a Web site that lets users create hosted databases. "We need to make sure that we understand from an end-to-end perspective what the user is going to experience, and we need to make sure that [performance] stays consistent."
Kosaraju tracks statistics such as how long it takes a person to sign up for an account, add a record and review data. "These metrics are much more important than making the home page download faster," she says.
Tracking the user experience becomes more difficult as Web sites get more complex. A typical e-commerce transaction involves a user visiting a home page, finding and ordering a product, using a third-party service for credit card verification, getting confirmation of the order, having the order entered into an enterprise resource planning system and shipping the item. Web site operators need to track the speed, accuracy and reliability of all steps in this process to understand the end user's experience.
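Instrumenting a pipeline like that usually means timing each step separately, so a slowdown can be pinned to the step that caused it. A minimal sketch, with made-up step names matching the order flow above and sleeps standing in for the real work:

```python
import time
from contextlib import contextmanager

timings = {}

@contextmanager
def timed(step):
    """Record how long one step of the transaction takes."""
    start = time.perf_counter()
    yield
    timings[step] = time.perf_counter() - start

# Each sleep is a placeholder for that step's real work.
with timed("load_home_page"):
    time.sleep(0.005)
with timed("order_product"):
    time.sleep(0.005)
with timed("verify_credit_card"):  # third-party service call
    time.sleep(0.005)
with timed("confirm_order"):
    time.sleep(0.005)

total = sum(timings.values())
print(f"total: {total:.3f}s across {len(timings)} steps")
```

With per-step numbers in hand, an operator can tell at a glance whether a slow checkout is the site's own fault or, say, the third-party credit card verifier's.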
At many Web sites, measuring transaction times goes hand in hand with redesign efforts. For example, Fedex.com recently redesigned its Web site to speed the time it takes users to ship and track packages.
Claire Ruddy, manager of Fedex.com marketing, says the company eliminated as many graphics as possible, and began using Akamai's off-site caching service to improve the site's performance. Together, the two moves have resulted in page refresh times dropping from an average of 5 seconds to 1.5 seconds.
"Based on what our customers tell us . . . the No. 1 most important feature of our site is speed," she says, adding that Fedex.com made its first appearance on Keynote's five fastest business Web sites list in September. In addition to the Keynote service, Fedex.com uses its own server-based tools to measure response times from its back-end systems.
But speed measurements aren't enough, Ruddy says. Fedex.com also uses network management tools to ensure its Web site is up 99.999 percent of the time. Fedex.com runs regular usability tests to confirm its customers can navigate through the site and find what they want.
"As part of the redesign, we brought a lot of features to the home page that had been buried three clicks in," Ruddy says. "Our carrier site was two to three clicks down. Now it's on the home page, and we've seen a 300 percent increase in its usage."
Ruddy recommends a balanced approach to Web site design. "You don't want a site that's lightning fast, but customers can't find what they're looking for," she says.
As useful as today's Web site performance monitoring tools are, there's room for improvement. Web sites are looking for more detailed diagnostic information to isolate problems as well as automated tools for fixing them.
"The Holy Grail in this area is a tool that tells you not only that the site is slowing down, but what the problem is," Gartner Group's Gassman says. "Is it the SCSI controller? Is it the [Domain Name System] server? Is it the middleware? Nobody has that capability yet."
Gassman says the most promising approach is the use of agents in browsers and Web applications that automatically measure performance from the user perspective. These agents would integrate with network management platforms and policy engines that can fix problems once they are detected.