The power of the grid

More power. It's a concept that appeals to just about every technology enthusiast. Thanks to continuous advances, we hold fast to the promise of faster processing, more storage, and unlimited information access from any device. And so we turn to grid computing. Similar in concept to the electrical power grids that channel electricity where it's needed, grid computing aims to deliver data processing resources, including CPUs, storage, and applications, across nations and continents.

We're talking about serious computing here. The scientific community and organizations such as CERN (the European Organization for Nuclear Research, which gave birth to the World Wide Web) are working on ways of processing petabytes of data.

And supercomputing vendors such as IBM, Sun Microsystems, Hewlett-Packard, Compaq Computer, Silicon Graphics, and Cray have woken up to the idea. IBM started its Grid Computing Initiative last year, and Sun recently announced its Technical Compute Portal offering, comprising the iPlanet Portal, Sun's Grid Engine, and Sun ONE (Open Net Environment).

The appeal of grid computing to these vendors goes beyond the opportunity to ship big iron. The big question still floating around the scientific community is, "How much processing power is enough?" A popular line of thought suggests Moore's Law cannot keep pace with large enterprises' need for more CPU cycles, unlimited storage, and boundlessly distributed applications.

Where business is concerned, real working examples are hard to find. But it's not hard to imagine that in the future it will become increasingly difficult for existing supercomputers to analyze data contained in vast databases spread across many geographical areas.

Technologies such as UDDI (Universal Description, Discovery, and Integration) and XML can help, but if grids are to be successful they must be easy to use and devoid of ad hoc programming. Interestingly enough, Oracle told us last week that it wants to develop "the database for the grid." Today, Oracle's biggest databases handle between 30TB and 40TB of data, but executives are predicting huge growth rates.

If it all sounds too far off, start thinking on a smaller scale. Grid computing can become the logical evolution (and consolidation) of multiprocessor servers and computer clusters, tied together by technologies such as InfiniBand. Take a second look and you will see an architecture similar to a computing grid, one that can flexibly allocate CPU cycles and terabytes to your applications.
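The allocation idea at the heart of that architecture can be sketched in miniature. The toy scheduler below simply deals tasks out round-robin to a set of named nodes; it is an illustrative assumption of ours, not any vendor's actual grid engine, and the node names are hypothetical:

```python
from itertools import cycle

def allocate(tasks, nodes):
    """Assign tasks to nodes round-robin -- a toy stand-in for a grid
    scheduler that parcels out CPU cycles across a cluster."""
    assignment = {node: [] for node in nodes}
    # cycle() repeats the node list indefinitely, so tasks are
    # spread as evenly as possible across the available nodes.
    for task, node in zip(tasks, cycle(nodes)):
        assignment[node].append(task)
    return assignment

jobs = ["render-1", "render-2", "sim-1", "sim-2", "analyze-1"]
plan = allocate(jobs, ["node-a", "node-b"])
# node-a gets jobs 1, 3, 5; node-b gets jobs 2, 4
```

A real grid scheduler would weigh node load, data locality, and job priority rather than dealing work out blindly, but the core contract is the same: applications submit work, and the grid decides where the cycles come from.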

The additional power and flexibility that enterprises need are there, but we have to learn how to exploit them before the future catches up with us.

E-mail us at mario_apicella@infoworld.com and mark_jones@infoworld.com.
