Straight Talk About Windows 2000

BOSTON (06/15/2000)

WHY BANKS ARE E-COMMERCE CHALLENGED
BY MARILYN R. SEYMANN

I'm not a CIO. I can't fix your PC, make your systems secure or write a program. But as the CEO of a bank consulting company who spends half her time engaged in techie talk with boards of directors and CEOs and the other half on boards approving--albeit skeptically--technology plans and budgets, I can offer perspective and insight that comes from years of experience.

E-commerce and the Internet are no longer the vague, mysterious entities they once were, and most bank CEOs and directors have finally come around to admitting that whether banks can move forward with e-commerce initiatives while managing short-term earnings pressures, ongoing acquisitions and limited resources is no longer in question.

By virtue of serving on the boards of the most regulated industry in America, bank directors traditionally evaluate risk based on the degree to which errors can be avoided and mistakes eliminated. Risk is managed with rigid operating procedures that allow minimal exceptions, require thorough documentation and provide a careful review of errors. A technology investment that doesn't pan out is handled like a bad loan--correct it, and avoid it in the future.

How to balance risk with the Internet and e-commerce? Nonbanks vying for a piece of the financial services business operate under a completely different set of risk rules from banks. Regulation is lax on the Internet, and even bank regulators have taken a hands-off approach to e-commerce. Mistakes are not only tolerated, they are celebrated. (How many millionaires have resulted from public offerings of new companies that have never made money?)

CAUTION TO THE WIND

Risk is no longer about avoiding mistakes. Caution is no longer the name of the game. Missed opportunities and lost revenue streams result from an inability to enter a market fast enough.

Speed and risk have become the very foundation of all economic activity, yet bank CEOs are approaching the "e" world with deliberate caution, allowing their traditional risk aversion to paralyze progress. Impossible as it is to predict the shape of things to come, one thing is certain: Banks can no longer allow their overly cautious risk profiles to further distance them from industries that are vertically disintermediating them. Paradoxically, in an attempt to take only "safe" risks, banks are taking a huge risk, namely, losing out to forward-thinking competitors. Banks must learn from the dotcom world that zero risk and speed are incompatible.

One of the biggest challenges facing a bank CIO in the e-commerce world is convincing the board to change its traditional view of technology risk. Because the best decisions flow from knowledge, CIOs need to help their CEOs and directors readjust their thinking. There are many ways to do this: create IT white papers, establish an R&D line item in technology budgets, and encourage directors to look outside the industry for examples of speed, customer focus and value creation. Begin promoting the idea that e-commerce initiatives might require new departments or lines of business on the organizational chart, and perhaps even physical separation, and provide specific examples of how this works. But above all, be patient. If your message doesn't get through the first time, repeat it. Today CIOs must help boards understand that the greatest risk posed by e-commerce is doing nothing at all.

Marilyn R. Seymann is president and CEO of M One, a Phoenix-based consulting company specializing in bank management and technology planning.

MICROSOFT'S STRENGTH
BY WILLIAM F. ZACHMANN

There were plenty of pundits dumping on Windows 2000 in the run-up to Microsoft Corp.'s formal launch this past February. At least one prominent IT-guru company did a major league "Chicken Little" act, warning clients to go slow on Windows 2000 and implying that the sky would fall on early adopters. A Valentine's Day piece in The New York Times three days before the introduction of Windows 2000 was headlined, "Microsoft Faces Skeptical Market with Windows 2000." Microsoft's share price slumped in the week before the launch under the weight of all the bashing.

Microsoft bashing has lately been such a popular fad among the press and industry gurus that it may be a few more months before the truth will win out.

It is only a matter of time, however, before all but the most hard-core die-hard anti-Microsoft fanatics will be able to see, plain as the egg on their faces, that the professional pontificators who predicted a perilous, slow and rocky road for Windows 2000 acceptance were all wet. Not only will the move to Windows 2000 be remarkably swift, by any reasonable standard, but the impact of Windows 2000 on enterprise computing will be profound, far reaching and long lasting as well.

We are not in Kansas anymore, Toto! Corporate computing is about to undergo one of those major tectonic shifts that rearrange continents. The skeptics will be proved wrong as Microsoft's latest effort to provide industrial-strength platforms advances even deeper into the heart of corporate IT practice, to the growing discomfort of Microsoft's competitors. Windows 2000's gains will, necessarily, be losses for the more established high-end IT platform vendors, such as IBM Corp., Oracle Corp. and Sun Microsystems Inc.

The move beyond standalone PCs began in the second half of the '80s, initially with truly local networks, which for the most part simply supported file and print sharing. By the early '90s, as the technology matured and more capable and reliable servers appeared, client/server architectures became viable foundations for increasingly mission-critical applications.

Client/server applications were initially deployed on more powerful, semiproprietary Unix-based platforms from more traditional vendors. They became a plausible option on Intel platforms with OS/2 and Windows NT servers in the first half of the '90s.

THE POWER OF OLD ARCHITECTURE

The use of Intel-architecture client/server options throughout the first half of the '90s was limited because both OS/2 and NT were still under development and lacked capabilities needed to fully support industrial-strength applications. The competitive edge for more complex high-reliability applications remained with Unix-related workstations and servers from vendors including HP, IBM and Sun. Wintel alternatives, though they gained ground in the first half of the '90s, did little more than nibble away at the more traditional alternatives' territory.

By the mid-'90s, the explosion of the Internet generally found the older architectures more suitable for large-scale Web applications than the newer Intel-centric alternatives. The big websites that developed during the last half of the '90s were mostly built on traditional Unix server platforms. Windows NT Server on Intel platforms, while increasingly popular for small- and midscale website hosting, remained a minority option through most of the decade.

With the introduction of Windows NT 4.0, though, things began to change during the second half of the '90s. Plagued by some early problems, NT4 got off to a relatively slow start. Those who have recently predicted an equally slow start for Windows 2000 have done so on the assumption that the NT4 experience will be repeated with Windows 2000. They have, however, failed to observe some important facts.

First, Microsoft clearly learned from its experience with NT4 that getting a new operating system right is more important than getting it out quickly. NT4, like NT 3.1 before it, was rushed to market to meet competitive challenges primarily from IBM's OS/2. In retrospect, Microsoft's decision to move more quickly to market with less thorough prerelease testing, while it may have slowed initial acceptance of NT, clearly accomplished its competitive objective regarding OS/2. Microsoft has, however, taken its time with Windows 2000, doing far more thorough testing than it did with any prior version of Windows.

Second, though linked to a 16-bit Windows legacy, NT was an almost entirely new operating system, all of whose parts needed to be checked out from scratch.

Prior versions of 16-bit Windows were not a foundation on which to build NT, but rather a compatibility problem to be dealt with. Windows 2000, however, is an enhanced, upgraded version of NT, built on an already quite stable NT foundation. Remember, Windows 2000 was originally named Windows NT 5.0 and therefore has a solid base that was not available in the earlier versions of NT.

Third, amid their negative consensus prior to the Windows 2000 launch, the press and pundits failed to note that Windows NT 4, after a slow start, took off like a rocket in the final two years of the '90s. Windows NT deployment, after laying down a long, flat growth curve for most of the decade, surged dramatically in 1998 with strong continued acceleration throughout 1999.

RAPID DEVELOPMENT

Roughly from the time of Service Pack 4, which finally brought NT the function and stability it ought to have had in the first place, NT deployments really picked up their pace. By the end of 1999, Windows NT was on a tear in terms of corporate acceptance, going into 2000 with enormous momentum. That momentum will only accelerate further with the transition to Windows 2000.

The skeptics' assumption that the course of Windows 2000 adoption would repeat that of NT 3 and NT 4 is as wrong as their assumption that Windows 2000 will suffer from the same problems of stability and performance on a scale similar to those initially encountered with NT. Useful though it may be to learn from history, one must not assume simple repetition of the same scenario when critical factors are different.

Windows 2000 deployment will be much more rapid than that of Windows NT, in part for the reasons above. It will also, however, be more rapid because it substantially expands the scope for using low-cost Wintel servers in the enterprise. Windows 2000's incremental capabilities put all but the very biggest applications within reach of increasingly powerful Intel server platforms, at costs well below those of proprietary IBM mainframes and minicomputers or semiproprietary Unix servers from vendors like Sun. Much as these established vendors would like to convince us otherwise, the truth is that user organizations have a fast-growing array of lower-cost but viable Intel-architecture alternatives. Windows 2000 accelerates the growth of those alternatives.

Windows 2000 on Intel-architecture servers may not be the best choice for every possible use, but it will surely be a viable option for more uses than any of its predecessors. It will extend the reach of Wintel platforms even further into the mission critical space of industrial-strength applications for the enterprise. It will put more competitive pressure on traditional competitors and force them to lower prices simply to stay in the game. It will further reduce the territory of the older proprietary and semiproprietary platforms unique to individual system vendors. And it will, of course, benefit user organizations by bringing them more options at lower costs.

William F. Zachmann is vice president of Meta Group, an IT research company based in Stamford, Connecticut.

Do you have an opinion you would like to express? Let Senior Editor Megan Santosus know at santosus@cio.com.
