Two paths for the future of computing

It was one of those days when you wonder if you're spending too much time on the conference circuit. The topic seemed important enough: the future of software licensing. But while the vendor panelists were all sufficiently rehearsed, it was obvious that they were struggling to find something new to say.

The lady from Sun predicted that the emergence of a true open-source software stack would eliminate a huge chunk of today's licensing costs. The gentleman from Inc. argued that once software becomes a utility-like service, many of today's software management problems will simply disappear. Inevitably, the chap from Computer Associates added that because customers like choice, a variety of economic models will surely be required.

All fair enough, but all things you've probably heard before. However, as I started to doodle and examine the ceiling panels, it occurred to me that these themes are actually much more in conflict than they initially appear. While open-source software and utility computing have become the two most heavily promoted trends in IT architecture, I hadn't previously realized how fundamentally at odds these ideas really are.

Consider that the desire for open-source software stems largely from the beliefs that customers often aren't well served by closed, proprietary, single-vendor software and that making source code freely available can result in better, cheaper, more flexible products. In contrast, the argument for utility computing is based on the assumption that customers should be shielded from the underlying complexity of software whenever possible. Effective businesses should use applications, not tinker with programs.

These two philosophies suggest sharply different future scenarios. The open-source community believes that by banding together, the world's programmers can break the hold suppliers have traditionally had over the IT business, greatly reducing both software licensing and switching costs. Utility computing presents a much more vendor-friendly vision, where suppliers are largely responsible for technology development and delivery.

Stated bluntly, in a services-driven environment, there would be little real need for the open-source endeavor. Vendors would be able to afford expensive development teams and proprietary innovation, since the cost of this work could be spread across a large number of customers. Conversely, the more the open-source model succeeds, the more likely it is that customers will keep their IT work in-house, to the clear detriment of utility-style providers.

Of course, it's hardly surprising that, other than Microsoft, virtually all of today's leading IT vendors -- IBM, Sun, Hewlett-Packard, Oracle -- enthusiastically support both models. Naturally, they want to shape each approach to their advantage. Thus, as a rule, they support open-source initiatives in those areas where they don't have a strong market position and promote utility computing where they do. Why would they behave otherwise?

Consequently, the real question is which of these two visions IT customers will find more attractive. In the end, neither will be an absolute winner, but the market impact of each will likely vary depending on the time frame. Over the next few years, the open-source community will prove to be the much more vital force, and the utility/grid concept will be criticized for its excessive supplier hype. However, a decade from now, the services model might well dominate, with open-source activity looking increasingly like a legacy domain.

Sometimes the best thing about a conference is simply the time you get to think.

- David Moschella's latest book is Customer-Driven IT: How Users Are Shaping Technology Industry Growth.
