The utility computing promise

Tapping into compute resources as simply as plugging a lamp into an outlet has been a goal of pervasive computing efforts from the start. Known as utility computing, the idea is to provide unlimited computing power and storage capacity that can be used and reallocated for any application -- and billed on a pay-per-use basis.

Already present in a variety of capacity-based pricing models, utility computing is poised to expand throughout the enterprise as key technologies -- such as Web services, grid computing, and provisioning -- intersect. Its growth will give the industry not only equal access to supercomputing resources, but also new revenue streams for commercial data centers, new application pricing models based on metered use, and an open computing infrastructure for companies with little or no standing IT maintenance budget.

"Companies can begin to look at computing as a utility right now by virtue of the fact that utilitylike services are already available to them, including capacity-based pricing models for overhead infrastructure that is placed onsite," explains Bill Martorelli, vice president of enterprise services at Hurwitz Group Inc. in Framingham, Mass.

Martorelli points to the wide range of utilitylike service offerings from companies such as IBM Corp., Hewlett-Packard Co., Sun Microsystems Inc., and Compaq Computer Corp., in which additional servers, storage devices, and printers are placed on site with customers. The customer is charged for the gear only when it is turned on and used.

As something of a first step toward the vision of utility computing, IBM recently penned a US$4 billion deal with American Express Co. to provide all of the financial services company's technology infrastructure as a utility, managed and maintained by IBM. American Express expects to save hundreds of millions of dollars during the life of the seven-year contract.

Dev Mukherjee, vice president of strategy for e-business on demand at IBM in Armonk, N.Y., says that for American Express, Big Blue's compute utility service is "less a technology thought than a liberating thought," and that the American Express deal is only the beginning of an industrywide trend.

"I would describe utility computing as a wholesale shift in the computing industry," Mukherjee says.

"We are moving to where we will have a gridlike compute environment where all these capabilities -- whether they are infrastructure capabilities like storage or databases or special equipment that allows you to do vector processing -- will all be available in the grid, and anything a customer needs they will be able to get on demand across the grid," Mukherjee adds.

Ingredients in the mix

Utility computing on a global scale will require the continued evolution and convergence of core technologies spanning Web services, grid computing, broadband, storage virtualization, automatic provisioning, change management, and security.

Martorelli says early utility computing efforts, such as the American Express/IBM relationship, will begin as in-house projects and follow the same pattern of emergence as the core technologies.

Jeff Gilliam, West Coast region president of Electronic Data Systems (EDS) in San Ramon, Calif., says he is beginning to see increased demand for utilitylike computing technologies inside companies struggling to cut IT costs and consolidate resources.

"A lot of companies are looking at utility computing models, even if they are just looking for a more accurate charge-back mechanism for billing their own departments for the compute power they are using," Gilliam says. EDS recently licensed MicroMeasure utility computing billing software from Mountain View, Calif.-based utility computing technology company Ejasent. With MicroMeasure, EDS customers can begin to shape their business processes into a more utilitylike model while getting better control of distributed resources within their own network.

Of the emerging technologies that will support extended utility computing between companies, Web services will likely play the largest role, says Yogen Patel, vice president of marketing and product management at Ejasent.

"I think Web services will be the primary driver of utility computing," Patel says. "When you think of Web services, it is going to be an electronic cloud where there will be machine-to-machine interaction without human intervention. And if you think about the computing fabric that's required to support the massive amount of processing that is going to be needed to enable Web services, it's going to be a utility model."

Hewlett-Packard has been advancing its UDC (Utility Data Center) initiative for several years, and is currently focused on auto-provisioning technology that will enable utility computing, according to Nick Van Der Zweep, director of marketing at HP's Always On infrastructure division, headquartered in Palo Alto, Calif.

Compaq is also working on dynamic reallocation of resources as part of its Compute On Demand Initiative introduced last July.

"We're working on what we call 'dynamic reallocation of datacenters,' which is the ability to move applications and operating systems between the individual hardware boxes," says Joe Hogan, worldwide managing principal for Compaq's global services and outsourcing, based in Houston.

This type of work is vital to ensuring that servers and storage devices are properly scrubbed before hosting another company's job.

"You have to make sure your data is secure and is not getting into the hands of the next customer that uses the infrastructure," HP's Van Der Zweep explains. "It's one thing to provision something and turn it on, but when you deprovision it you have to clean it all up, make sure there are no security loopholes left behind."

Efforts to enable automatic provisioning will be shored up as software companies such as Tivoli and Computer Associates begin creating application topologies for their products that can be updated and changed in real time. Progress being made by companies such as Avaki and Platform Computing to forge grid computing standards will also extend the reach of utility computing fabrics that drive an economy of Web services, Martorelli says.

Planned utilitarianism

As for possible bottlenecks in a global compute utility, officials at Palo Alto, Calif.-based Sun see broadband technology as less of a worry than secure provisioning or advancement of Web services.

"Datacenters that are linked across the world from one another are almost becoming a nonissue in terms of cost compared to other components of an infrastructure," says Chris Kruell, group marketing manger of enterprise systems products at Sun.

Although utility computing on a nationwide or global level is still many years out, companies can begin laying the groundwork in their networks now, preparing their organizations to integrate more easily into a utility computing environment.

A recent study on utility computing by Forrester Research in Cambridge, Mass., suggests IT buyers begin looking at networkable servers and storage devices that will easily attach to future utility computing fabrics. For example, the report states that direct attached storage arrays "are cheaper than networked storage today, but will be hard to attach to the fabric years from now."

For now, time seems to be on the side of IT. Some analysts put the arrival of global utility computing as much as 10 years out. Although component technologies such as Web services and automated provisioning will mature far more quickly, experts believe the real hurdle will be cross-vendor cooperation.

"It's safe to say we are still at the early adopter stage of utility computing," says Compaq's Hogan. "But the promise is here already in the ability to dial up and down compute resources, and in the current economy that's important."

Beginning the utility computing discussion

Utility computing is certainly on the minds of network administrators and boardroom executives alike, but not all companies are rushing to adopt it.

Although utility computing promises a way to cut technology management expenses while consolidating resources, many enterprises are budgeting for relatively short-term projects and hesitate to spend past the known horizon.

John Hutchins, a storage network manager at Zurich Insurance in Chicago, says his company has looked at utility computing models for its storage capacity, but that the insurer is still buying storage almost exclusively on a "project-to-project basis."

"If we only need 8TB of capacity, there is no reason for us to have [an additional] 8TB laying around," Hutchins says, adding "The state of [utility computing] technology is too immature to have the capabilities we need."

Kenneth English, a systems engineer at J.P. Morgan in New York, believes technology has a lot of catching up to do before utility computing will become a reality for a company with as many computing needs as J.P. Morgan.

"We're too diverse," English says. "We're too dynamic a company for utility computing as it stands now."

Executives at United Airlines, however, are talking about "utility computing" when discussing how to better manage technology, says David Belch, a senior systems engineer at United in Chicago.

United already outsources components of its technology infrastructure -- such as its network switching, which is taken care of by Galileo Technology -- and United's servers and storage may not be far behind, Belch says. "We're already moving to SANs on our Chicago campuses, and we're already talking the language of [hardware] consolidation.

"There are a lot of economics to it, but [utility computing] is a discussion we're having in the boardroom," Belch adds.

Jeff Gilliam, West Coast region president of EDS, a technology service company in San Ramon, Calif., thinks adoption of utility computing models will continue to be somewhat slow until more companies jump on the bandwagon.

"Uptake has been slow because there is still not a lot of utility computing out there, so [enterprises] don't know how to benchmark it," Gilliam says.

Ed Cowger, a research director at Gartner in Stamford, Conn., echoes these concerns, noting that with the current maturity level of the technology, embracing utility computing comes down to a long-term vs. short-term view.

"The basic problem with utility computing right now is that the No. 1 concern for IT right now is, 'How do I lower costs, how do I reduce costs?' And utility computing, while ultimately it may lower cost, it's not going to do it initially," Cowger explains.
