BOSTON (05/23/2000) - "Abstract concept." That's about the best way to describe the process of designing and maintaining an effective e-commerce site today.
That's because no one knows exactly how to take what customers want and parlay it into how Web pages are designed. So leading e-commerce companies have learned how to make the process of Web site design and customer satisfaction a little less abstract by asking customers exactly what is and isn't working: Can they find product information quickly and easily? Was their order delivered on time?
To answer those questions, many organizations are turning to outside monitoring agencies. Such firms often provide a way for customers to give feedback about the design of a page, the overall buying experience or whether their orders were actually delivered on time, through both statistical and anecdotal means.
This is valuable information for companies. Not only does it alert them to potential problem areas, but it also gives them insight into the nature of those problems. Furthermore, the information can help drive the changes necessary to make a fix. Companies can then tell the customer that a change was made, closing the loop and hopefully inviting the customer, who has given them free advice, back to make another purchase.
Traditionally, companies have used focus groups to assess the pros and cons of site design and product selection as well as customer satisfaction with order fulfillment. But using focus groups to identify and correct a problem is a slow process. A Web site problem, once known, might only require a few minutes of coding to fix. So e-commerce retailers are supplementing focus groups with more immediate feedback.
Here's a look at how three companies use feedback data to drive e-commerce improvements.
STAPLES.COM: Making Metrics From Anecdotes
E-commerce companies have an abundance of data. Site logs track where visitors go, which pages are most popular and how many customers return. E-commerce engine logs track repeat customers and the average price per order.
But translating that data into action items isn't so simple. "I think a lot of companies are overwhelmed" by the amount of data they collect, says Jackie Shoback, vice president of operations at Staples.com, the e-commerce arm of Staples Inc., the Framingham, Massachusetts-based office-supply retailer.
"We're fortunate that we're click-and-mortar," she says, because Staples has been analyzing customer satisfaction for years.
Staples.com subscribes to a service from BizRate.com in Los Angeles that lets customers give feedback about the site they're using. Fulfillment is a crucial e-commerce variable, and it's not something that necessarily shows up on a site log.
With BizRate Inc., a browser window pops up and asks customers if they'd like to rate the site from which they just purchased a product in the following areas: ease of ordering, ease of navigation, appearance, product selection, product information and pricing. Shoppers can also opt to take part in a follow-up survey. When the customer is expected to receive his shipment, BizRate sends an e-mail with a link to an online survey, where the customer can rate delivery time, product representation, customer support, privacy policies and shipping and handling. In both surveys, customers can offer additional anecdotal feedback.
"The actual raw data scores are great, but ... it's the comments, and drilling into those and further categorizing them and finding the common answers, where you can improve what you can do," says Shoback.
One member of the Staples usability team collects all customer comments and turns the anecdotal data into metrics. Those metrics are then categorized into function areas and forwarded with the anecdotes every Monday morning to a cross-functional team with members from information technology, marketing, usability and corporate areas. The team then holds a top-level review to flag which departments are affected by problems and to assign responsibility for making changes.
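The core of that process, bucketing free-text comments into function areas and counting them up for the Monday review, can be sketched roughly as follows. The categories and keywords here are illustrative stand-ins, not Staples' actual taxonomy:

```python
from collections import Counter

# Illustrative keyword buckets; a production taxonomy would be far richer
# and likely maintained by the usability team rather than hard-coded.
CATEGORIES = {
    "navigation": ["find", "search", "menu", "navigate"],
    "fulfillment": ["late", "shipping", "delivery", "arrived"],
    "pricing": ["price", "expensive", "discount"],
}

def categorize(comment: str) -> str:
    """Assign a comment to the first category whose keyword it mentions."""
    text = comment.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in text for word in keywords):
            return category
    return "other"

def weekly_metrics(comments: list[str]) -> Counter:
    """Turn a week's worth of anecdotes into per-category counts."""
    return Counter(categorize(c) for c in comments)
```

The resulting counts, forwarded along with the raw comments, give the cross-functional team a quick way to see which departments a week's complaints actually touch.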
In a recent incident, the Staples.com site slowed considerably for people in one geographic area, and customers began writing in about the problem. The comments were immediately forwarded to IT, which took action. "It probably would have shown up when you run all your site statistics but was probably very small in aggregate. But when you actually started hearing from people and saw it, Staples was able to fix it right away," says Shoback.
While measuring customer satisfaction is a priority for retailers, a merchant with a physical, online and delivery presence has a lot of contact points to measure. In Staples' case, there are 29 distribution centers, each with its own fleet of trucks. Ordinarily, Staples would gauge its effectiveness only through internal metrics: Were goods delivered, and were they on time?
BizRate data gives Staples a more granular look at distribution processes.
Customer feedback is indexed against the relevant distribution center or driver. As with anecdotal feedback, distribution feedback is translated into metrics and pushed out to the appropriate departments when there are problems.
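Indexing delivery ratings against a distribution center amounts to a simple grouped average, which is enough to spot an underperforming center. A minimal sketch, with hypothetical center IDs:

```python
from collections import defaultdict
from statistics import mean

def center_scores(feedback: list[tuple[str, int]]) -> dict[str, float]:
    """Average delivery ratings per distribution center.

    `feedback` pairs a center ID with a 1-5 delivery rating
    taken from the follow-up survey.
    """
    by_center: dict[str, list[int]] = defaultdict(list)
    for center, rating in feedback:
        by_center[center].append(rating)
    return {center: mean(ratings) for center, ratings in by_center.items()}
```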
Shoback declined to quantify the overall customer response rate to the BizRate surveys, but she says it's both statistically significant and much higher than the 2 percent response rate Staples targets on its direct marketing campaigns.
Staples reads all customer feedback and responds to most of it.
WBStore.com: Gaining Online Customer Insight
Warner Bros.' online store, Burbank, California-based WBStore.com, has been open for business for five years. But it wasn't until the 1998 holiday shopping season, when it did more business than management expected, that it made a serious effort to boost the revenue it earned. To do that, the company needed better site data. "We knew how many page views we were getting but that wasn't telling us enough about our site, especially as a retailer," says Bettina Sherick, marketing director for e-commerce.
Dave Clark joined WBStore.com as vice president of e-commerce the following April and started shopping around for third-party rating services. He eventually selected BizRate.
In November, WBStore.com launched BizRate to provide insight into customer satisfaction with its site and delivery processes. After it began using BizRate, it saw an immediate jump in feedback. From November to January, WBStore.com got 12,000 responses through the service.
Given the amount of data gathered and the need to act upon it quickly, Warner Bros. tapped one employee to spend much of her time taking the feedback and distilling comments for weekly steering meetings attended by Clark, a vice president of direct marketing and three Internet directors.
Customer feedback has driven many changes. Most involved placing bits of information wherever customers might need them, so users wouldn't have to hunt for them.
For example, when a significant portion of the thousands of responses gathered in a couple of months said customers had difficulty finding customer service phone numbers, a link to that information was added to every page. Other customers complained that the product-ordering pages contained children's clothing sizes but didn't include a definition of what "medium" really meant, so WBStore.com added that information. "I don't think you can go overboard in giving people what they want, unless it slows things down and makes a less-compelling experience," says Clark.
Clark says the demographics of the BizRate respondents so far largely mirror those of survey respondents who shop in Warner Bros. retail stores and of focus groups the site conducts.
The 150-plus brick-and-mortar Warner Bros. stores also conduct their own focus groups and share customer satisfaction surveys and buying-pattern information.
But although that information is useful and acted upon, it doesn't give WBStore.com the kind of immediate feedback it gets from using a real-time ratings service. In addition, Clark says he values the ability to benchmark the site from month to month. "We just relaunched our site, and we're really anxious in a month or so to go in and look and see firsthand feedback from customers," he says.
MVP.COM: Rating Pages in Real Time
If you're not careful, you might miss it: a small "[+]" graphic in the upper-right-hand corner of every MVP.com Web page that occasionally spins into a "[-]" graphic. Hover your mouse over it, and the graphic widens into boxes numbered one to five and the request to "Please rate this page."
It's a ratings service called OnlineOpinion, from Chicago-based OpinionLab LLC.
MVP.com, also in Chicago, subscribed and added the OnlineOpinion tag to each of its Web pages in time for the site's January launch. MVP.com is an online sporting goods, outdoor and fitness retailer founded by sports legends Michael Jordan, John Elway and Wayne Gretzky.
OpinionLab aggregates the data, provides MVP with online access to various reports and lets users set rating thresholds. If a page falls below a certain rating, someone can be e-mailed immediately.
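The threshold mechanism is straightforward: if a page's average rating drops below a subscriber-chosen floor, an alert goes out. A minimal sketch, with the threshold value and the mail hook as stand-in assumptions rather than OpinionLab's actual interface:

```python
from statistics import mean

ALERT_THRESHOLD = 3.0  # illustrative; the service lets users pick their own

def check_page(url: str, ratings: list[int], notify) -> bool:
    """Call `notify` with an alert message if a page's average 1-5 rating
    falls below the threshold.

    `notify` is any callable (e.g. a mail-sending hook) standing in for
    whatever alerting mechanism is actually wired up.
    """
    if ratings and mean(ratings) < ALERT_THRESHOLD:
        notify(f"Page {url} rated {mean(ratings):.1f}, below {ALERT_THRESHOLD}")
        return True
    return False
```

In practice the alert payload would name the page and its recent score, so the responsible department can pull the associated comments and diagnose the drop.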
"We weren't sure how willing people were going to be to submit ratings," says Ian Drury, chief technology officer at MVP.com. No problem: The company gets about 1,000 ratings per day on its pages. "These metrics give us unique insight into how our customers feel about each individual page. That's not something we can necessarily get from our e-commerce database," says Drury. The ease of use of the graphic, he surmises, has helped lead to relatively high penetration rates with customers, although he declined to quantify those results.
To keep the site scoring high with customers, MVP uses a two-stage approach.
First, it keeps an eye on statistical data to diagnose problems. Then, when something needs to be fixed, it relies upon customer comments to discern the problem and find a good solution. About 25 percent to 30 percent of customer responses include additional anecdotal comments.
Every Thursday night, the marketing, merchandising, customer service, fulfillment and technology departments receive full reports. The next afternoon, representatives from each department come to a weekly meeting, armed with action items for addressing problems. Changes made the week before and their subsequent impact on metrics are also examined to see if they were successes or failures. Action items, such as marketing planning, vertical e-mail campaigns or technology enhancements, are tracked on a rolling spreadsheet by Drury's office. Microsoft Corp.'s Project software is used to track the more complicated efforts.
Overall, the OpinionLab data helps MVP determine which part of the site to focus on first, says Drury. It has "really minimized our dependence upon usability tests, which, until you had tools like this, were the only way of getting such tangible feedback from customers." That kind of feedback helps MVP.com rapidly refine its site, says Drury. Without the anecdotal data, the company would have to monitor the clickstream data daily and make educated guesses about why customers were dropping out at certain points.
One of the changes recently made in response to user feedback was on the product selection page. If a user went to the golf section and then clicked on "clubs and wedges," he would be presented with a list of 15 to 20 product choices. By scrolling over each name, he would see an image of the product. But many customers didn't understand how the page worked and said so via OnlineOpinion.
In the revised version of the product selection page, MVP now displays images for each product listed. The impact was immediate: Ratings for product selection pages jumped more than a point on a five-point scale. Customer conversion - getting people to buy or buy again - also increased for those pages. "That's the kind of positive, tangible business impact that having access to the OnlineOpinion data can provide," Drury says. "It highlighted an area for improvement; we recognized it, made it, saw the impact and saw an increase in conversion and revenue."