Utility computing may be one of the hottest topics in the technology industry these days, but there is much work to be done before it will ever achieve widespread acceptance, a panel of industry experts agreed Wednesday.
Panel members said that part of the problem was the term "utility computing" itself, which could be used to denote a wide variety of different concepts. "If you ask 10 people what grid or utility computing is you'll get 14 or 15 answers from them," said Rob Gingell, chief technology officer of software vendor Cassatt. In reality, utility computing is "not so much a thing as it is a confluence of various trends," he said.
Those trends boiled down to a widespread move toward commodity hardware and the Linux operating system, as well as a shift away from writing applications that are dependent on any one specific deployment environment, he said.
That opinion was echoed by another panel member. "At some level the term 'utility computing' is a bad word because it focuses on ... things that can distract you," said Carl Kesselman, chief scientist with grid software vendor Univa.
But semantics are not the only challenge facing utility computing, panelists said. There are economic and technical hurdles as well. Software companies and users, for example, do not agree on how to charge for an application that may run on five machines one minute and only one system the next.
"The vendors today seem to want very much for the users to pay for the full price of that (software) license as if it ran on that machine in perpetuity," said Dan Kusnetzky, an analyst with research firm IDC.
The concept of writing applications so that they can run on a wide range of different machines at different times, which is at the heart of today's utility computing products, has been around in various forms since the 1970s, Kusnetzky said. "If you think about it, utility computing is not new," he said. As recently as four years ago, similar concepts were being put forward under the term "application service provider," he said.
"There was something similar, five to ten years earlier than that, which was called the service bureau," he said.
And history may also work against utility computing, as IT managers recall problems they had with these earlier computing models, Kusnetzky said.
Utility computing providers will need to overcome not only skepticism but also a variety of new concerns if the concept is to gain acceptance, Kusnetzky said. "There are sociological, technical, regulatory and political issues, all of which tend to make this an interesting move, but it's not yet ready."
The fact that utility computing architectures can cross international borders creates new questions, for example. "If we outsource this to some country that has a different view of IP, who owns the work we did?" he asked.
"There are lots and lots of issues," Kesselman said, adding that the technology itself needs to mature. "If we look at what people are calling utility computing today ... it's pretty primitive," he said.
Still, the efficiency and cost benefits of utility computing will ultimately lead to many of these challenges being solved, at least within the enterprise. From there, the move to an outsourced "utility" model of computing will not be so great, he said. "As those things get sorted out within the enterprise boundary, now what I've done is I've laid down the ability to start outsourcing."