Managing outsourced data streams

At one time even customer-facing Web sites were nothing more than HTML brochures. As Web sites have become increasingly complex, more and more companies are turning to outsourced partners to handle services such as caching, content feeds, and secure credit card transactions.

But with each new partner introduced to a Web site, the chances of network problems, such as congestion or imbalanced loads, rise. And many network-management tools cannot identify which partner is causing the trouble, so corrective action can be painstaking and expensive, notes Brian Baggett, a member of the Discovery Channel's UNIX engineering support group and a consultant with Austin, Texas-based infrastructure management company Collective Technologies Inc.

"Traditionally it's been either too difficult and/or too time-consuming to accurately simulate the experience the end-user has, and then it can take even longer to trace where potential problems may lie," explains Baggett.

"Developing the means to do it yourself just isn't practical when you have to devote resources to the more pressing endeavors of keeping the online environment going," he adds.

In response to this problem, more companies are turning to network-management applications that provide visibility into partners' network activities. This not only lets businesses figure out which partners are causing the network clogs, it also comes in handy when measuring or investigating compliance with SLAs (service-level agreements).

Discovery.com, the Discovery Channel's Web site, includes affiliated sites for brands such as Animal Planet and the Travel Channel, as well as a shopping site. It averages between 2.5 million and 4 million page views per day. Some of the site's content is delivered by outsourced partners: caching by Akamai Technologies Inc., colocation services by Exodus Communications Inc., and ad serving by RealMedia and DoubleClick Inc.

The sheer complexity of Discovery.com's partner relationships often makes network management difficult: Monitoring and managing those outsourced partners using existing technology is certainly possible, but too labor-intensive to be worth the effort, according to Baggett.

Discovery Channel turned to a Web-management suite from AperServ Technologies Inc. The suite generates granular data about what is happening on the network, identifies the root causes of problems, pinpoints which outsourcer is responsible, and assesses the effect on performance.
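The article doesn't detail how AperServ's suite works internally, but the underlying idea -- timing each partner's service independently so a slowdown can be attributed to a specific outsourcer rather than to the site as a whole -- can be sketched in a few lines of Python. The partner names and URLs below are hypothetical placeholders, not Discovery.com's actual endpoints:

```python
import time
import urllib.request

# Hypothetical endpoints, one per outsourced partner (illustration only)
PARTNERS = {
    "caching":    "http://cache.example.com/health",
    "ad-serving": "http://ads.example.com/health",
    "colocation": "http://origin.example.com/health",
}

def probe(url, timeout=10):
    """Time a single HTTP GET; return latency in seconds, or None on failure."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
        return time.monotonic() - start
    except OSError:
        return None

# Probe each partner separately so a slowdown points at a specific
# outsourcer instead of showing up only as a slow page overall.
for name, url in PARTNERS.items():
    latency = probe(url)
    if latency is None:
        print(f"{name}: UNREACHABLE")
    else:
        print(f"{name}: {latency * 1000:.0f} ms")
```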

The demand for more detailed Web-management tools has given rise to outsourcing-focused management applications, including offerings from AperServ and MetiLinx Inc. of San Mateo, Calif.

"[These applications] would be helpful for Internet companies that outsource content development and creation in an ASP-type model," says Jack Gruninger, CTO of Brainbench Inc., an online skills-certification provider in Chantilly, Va.

"I see this becoming more valuable for companies without big IT departments, or who are outsourcing everything. It's one thing if the network's down and you're not getting any response from 15 sites on the Internet. But if the network's up and you're not getting content, that's a problem," he adds.

Baggett agrees, noting that with the management applications, his team can now easily pin down the culpable outsourcer if network latency, accessibility, or average load times suffer, or if issues take too long to resolve. The technology's reporting function can automatically determine whether an outsourcer is complying with its SLA -- a process that, using standard methods, can be painfully lengthy.
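How such a reporting function might decide compliance is simple in principle: compare logged measurements against the thresholds written into each partner's SLA. A minimal sketch, with made-up SLA terms and sample data (the partners, thresholds, and numbers are assumptions for illustration, not terms from any real agreement):

```python
# Hypothetical SLA terms per partner: maximum average latency (ms) and
# minimum availability (fraction of successful probes). Illustration only.
SLA_TERMS = {
    "caching":    {"max_avg_latency_ms": 200, "min_availability": 0.999},
    "ad-serving": {"max_avg_latency_ms": 500, "min_availability": 0.995},
}

def check_compliance(partner, samples):
    """samples: list of probe latencies in ms, with None marking a failed probe.

    Returns a list of human-readable SLA violations (empty if compliant).
    """
    terms = SLA_TERMS[partner]
    successes = [s for s in samples if s is not None]
    availability = len(successes) / len(samples)
    avg_latency = sum(successes) / len(successes) if successes else float("inf")

    violations = []
    if avg_latency > terms["max_avg_latency_ms"]:
        violations.append(
            f"avg latency {avg_latency:.0f} ms > {terms['max_avg_latency_ms']} ms")
    if availability < terms["min_availability"]:
        violations.append(
            f"availability {availability:.3%} < {terms['min_availability']:.3%}")
    return violations

# Example with made-up data: one failed probe drags availability below
# the threshold, producing the kind of "between these hours, this
# happened" evidence the SLA report is built from.
print(check_compliance("caching", [120, 180, None, 310, 150]))
```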

"There's a lot of detail in SLAs that you don't empirically measure. You just hope that [the outsourcer] is good for their word," Gruninger explains. Brainbench currently uses outsourced network-management tools to determine whether its partners are meeting the response-time and data-throughput standards established in their SLAs.

"You don't take the time to make sure [outsourcers] are crossing their t's and dotting their i's. They provide a great service, normally, but even on small disruptions or small violations of the SLA, you don't have any raw data to say, 'Between these hours, this happened.' It would be far too expensive to do that on our own," he adds.

Baggett is particularly intrigued by the ability to ensure that Web sites maintain prespecified availability thresholds.

"If you have certain quality-of-service agreements with your backbone provider or with those companies that cache Web content, [the utility] can ensure that you're getting what you're paying for and provide you with evidence if this isn't the case that can be helpful towards ensuring that," he says.

The result is better overall site performance, something any company can appreciate.

"Just like any other business, we're always searching for ways to improve the end-user's experience," Baggett says. "And the data that we can now gather is very meaningful, as it's a useful benchmark to consider how reliable or speedy our Web sites are to our viewers."
