A major paradigm shift that could ultimately redefine the current world order of the industry is slowly but surely working its way through the infrastructure of enterprise computing.
In the first few years of the Internet era, the clear winners have been IBM, Microsoft, Sun Microsystems, and Oracle. IBM probably gained the most by riding the Internet back to being relevant, while both Microsoft and Oracle ably reinvented themselves to remain major players following the end of the client/server era. But as things stand now, we have only completed the first wave of the Internet era. And a more tumultuous second wave has the potential to lift other companies more than IBM, Microsoft, and Oracle, which will always be relevant but perhaps not quite as dominant.
The second wave of Internet computing will be driven by the continued expansion of distributed-computing architectures. If you look at the first wave of Internet computing, the common theme was the rise of the server as a central location from which to give thousands of clients access to data.
But as we move forward, the next wave will be driven by the need to get data as quickly as possible to its intended destination. We have already seen the first manifestation of this wave in the rise of SANs (storage area networks). Instead of tying storage to specific servers, we now allow a range of servers to access the same shared storage resources. And now we are beginning to see companies such as EMC develop software architectures that will eventually allow some classes of applications, such as BI (business intelligence), to move from the server to the SAN itself. After all, the SAN is where the data is actually stored. We will soon see companies giving partners and suppliers direct access to their SANs through the firewall to facilitate real-time collaborative commerce applications. Needless to say, as this trend continues to develop, the relative value of companies providing storage will eclipse that of companies that focus only on servers. This is the very reason that IBM, Compaq Computer, and Dell Computer are currently obsessed with EMC.
But the next wave of Internet computing promises to go even further than that. SANs have pretty much virtualised physical storage assets. The next big thing is to virtualise the data itself. XML takes us a big step closer to accomplishing that because it is a neutral, self-describing data format. But XML probably will represent less than 2 per cent of all the data that exists in applications today, so clearly something else will be needed.
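What makes XML "self-describing" is that the field names travel with the data, so a generic consumer can recover a record's structure without an external schema. A minimal sketch (the customer record and its fields are invented for illustration):

```python
import xml.etree.ElementTree as ET

# A hypothetical customer record: the element names describe the data,
# so no external schema is needed to make sense of it.
record = """
<customer id="1042">
  <name>Acme Ltd</name>
  <region>EMEA</region>
</customer>
"""

root = ET.fromstring(record)

# A generic consumer reads the field names from the data itself.
fields = {child.tag: child.text for child in root}
print(root.attrib["id"])  # metadata carried alongside the data
print(fields)
```

Contrast this with a fixed-position binary or delimited format, where nothing in the bytes tells a consumer what each field means; that context lives in application code, which is exactly what makes such data hard to virtualise.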
That something is unclear, but three possibilities are being bandied about. The first involves the likes of Veritas Software, which has developed data management tools that could be adapted to create a virtual layer for accessing and managing data. A second approach involves content management companies such as Vignette and Documentum, or startups such as Noetix, which provide applications that can manage data residing in other applications. Finally, there are startup hardware companies such as Storigen Systems, which is developing a content management capability that resides directly in its SAN.
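None of these vendors' interfaces are described here, but the general shape of a virtual data layer can be sketched as follows (every class and method name below is invented for illustration): applications program against one abstract interface, while adapters hide whether the data physically lives in a database, a file, or on a SAN.

```python
from abc import ABC, abstractmethod

class DataSource(ABC):
    """Hypothetical virtual-layer interface: one way to read a record,
    regardless of where the data physically resides."""
    @abstractmethod
    def read(self, key: str) -> dict: ...

class InMemorySource(DataSource):
    # Stands in for a database adapter.
    def __init__(self, rows: dict):
        self._rows = rows
    def read(self, key: str) -> dict:
        return self._rows[key]

class FileSource(DataSource):
    # Stands in for file- or SAN-resident data; parses "key:field=value,..." lines.
    def __init__(self, text: str):
        self._rows = {}
        for line in text.splitlines():
            key, _, rest = line.partition(":")
            self._rows[key] = dict(pair.split("=") for pair in rest.split(","))
    def read(self, key: str) -> dict:
        return self._rows[key]

def report(source: DataSource, key: str) -> str:
    # The application sees only the virtual layer, never the storage.
    row = source.read(key)
    return f"{row['name']} ({row['region']})"

db = InMemorySource({"1042": {"name": "Acme Ltd", "region": "EMEA"}})
fs = FileSource("1042:name=Acme Ltd,region=EMEA")
print(report(db, "1042"))
print(report(fs, "1042"))
```

The point of the sketch is that both calls to `report` return the same answer even though the record comes from entirely different stores; swap in another adapter and the application code does not change.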
All of these tools have the potential to de-emphasise the role of the database in managing data. After all, if the data is located at the edge of the network rather than at its core in the server, then data repositories are not quite as important as applications that can truly work with distributed data.
More importantly, much of the intelligence associated with managing data will increasingly be embedded in the network hardware itself. So rather than running application integration tools on servers, we'll eventually see that requirement fulfilled at the edge of the network by next-generation intelligent routers.
It's too early to say what will ultimately happen, but anything is possible as the next wave of Internet computing develops.