IBM Corp. last week said it is investing US$4 billion in grid computing, a move that could help extend this technique for exploiting the power of thousands of distributed computers beyond its scientific and university roots and make it practical for all sorts of businesses.
The idea behind grid computing is to deliver computing and storage resources much like utility companies deliver power, eliminating or greatly reducing the need for customers to buy their own massive computing systems.
The technology, enabled by sophisticated clustering and supported over the Internet, is seen as particularly useful for handling huge computing projects that organizations might only need to carry out once in a while. It is already used for a variety of applications, ranging from weather forecasting to aerospace design.
"The state of Mississippi bought a supercomputer for all the companies located there a few years ago," says Annie McFarland, a Clipper Group analyst.
"They did that because they knew the companies in the state couldn't afford to buy the kind of computers they needed to do oil and gas exploration and visualization. Sharing the computing cycles makes a lot of sense for the types of applications you have in pharmacology, energy and the automotive industry," she adds.
IBM's $4 billion will go toward building 50 data centers around the world, some of which will support the U.K. National Grid, a consortium of networked computers that involves eight universities and has a budget of $25 million.
IBM is one of several big names in computing getting behind grid computing. Others include Sun Microsystems Inc., which last month announced it would donate its distributed computing Grid Engine software to the open source community.
Hewlett-Packard Co. also says it will make available its Coolbase software, which lets users share a variety of computing devices over the Internet, to the open source community.
IBM is working with an open source consortium called Globus (www.globus.org) to define security and access protocols, address bandwidth and latency issues, and identify the routing and switching technologies that separate environments within grids.
This work could help address some of the concerns companies are likely to have about grid computing.
"They fear losing control, of something going wrong and not knowing who is responsible," McFarland says.
"But one area where the Globus protocols are most effective is in policy management, quality of service and authentication," she says.
While grid computing technology someday might work its way into corporate networks, most companies that choose to take advantage of it will do so on an outsourced basis, observers say.
"Industrial projects so far at companies such as those at Glaxo Smith Kline or British Aerospace are not looking at implementing the software in house right now, although in a couple of years they will," says Tony Hey, architect for the U.K. National Grid.
"Now they are participating by giving manpower, cash, hardware or software licenses [to projects like ours]," he adds.