Cover Story: IT: Then -- Now -- Tomorrow

While the 1900s saw the dawn of the technology revolution, the coming century is set to witness its zenith.

To mark the impending date change, Sue Bushell questions three top analysts about the "hits" that never quite lived up to expectations, the technological developments that lie ahead and the ones that changed the way we do business today.

In addition, Laura Mason talks to eight IT professionals who boast a combined 235 years' experience in the industry.

They reveal their perspectives on the changing fortunes of IT as a career and discuss the past, present and future of technology.

Hindsight and prediction: the analysts' view

Is enterprise computing dead? Some 29 years after Intel launched the microprocessor that revolutionised business use of computers, at least one industry pundit is convinced Y2K is enterprise computing's last hurrah.

From here on in, says Forrester's group director, research, John McCarthy, the game plan changes dramatically. Anything you do with IT within your organisation is going to prove pretty small beer compared to the effort you're likely to put into inter-enterprise computing in the future.

It's the fag end of the century, and just over a year before we commemorate the 30th anniversary of the day Intel introduced that first microprocessor (4-bit bus, 108kHz 4004 chip, 60,000 operations per second, $US200).

So it seems appropriate to reflect on the technology hits and misses of the last couple of decades, and to prophesy where IT is likely to be in another decade.

To help us out, we asked McCarthy, IDC managing director Chris Fell and Gartner Group research director Bruce McCabe to dig deep into their memories to award some brickbats and bouquets, then dust off their crystal balls to give us their predictions for the future.

Bouquets

Not surprisingly, the birth of the microprocessor and the resulting PC platform figured high on our analysts' lists of the most significant technologies of the past few decades.

But equally crucial in Fell's eyes was the development of the GUI, which reached the mass market with the 1984 launch of the first Macintosh computer, the "Mac".

"This first phase set in motion the IT snowball," says Fell. "The improvements in efficiency and cost got the entrepreneurs thinking, 'what could we achieve if each individual worker had such a personal productivity tool at their fingertips?' Enter the PC, and the era of personal productivity."

Fell also nominates the invention of Ethernet in 1973 and the development of the client/server systems architecture as major advances, introducing the era of networking and allowing communication between workers within an organisation.

For the first time Ethernet let workers share knowledge and organisations leverage the efficiencies of individual workers.

"These technologies enabled the emergence of distributed computing within the enterprise," says Fell.

And, says McCarthy, don't neglect TCP/IP -- now the de facto networking protocol for just about anything you care to hook together.

"TCP/IP and all the protocols that sit on top of it, including HTTP and SMTP, have transformed the face of enterprise computing by giving organisations a common protocol that makes it as easy for them to go outside the organisation as inside," McCarthy says.

McCarthy also nominates e-mail, which has revolutionised the way we communicate in near real time.

Which brings us to the Internet, and HTML, perhaps the most significant development to date.

"The Internet is truly a revolution, not an evolution," says Fell. "It changes all the rules, not just within the enterprise, but at every level: between businesses and their suppliers, between business and their customers, between educational organisations and their students, between doctors and their patients, between government and the citizens they serve."

And the PalmPilot deserves an honourable mention too, says McCabe.

"The PalmPilot was a definite hit because it didn't try to pretend to be something that it wasn't, unlike most Windows CE devices which sort of look like a mini PC and disappointed everyone that bought them."

Brickbats

Was OS/2 or the Apple OS the most over-hyped operating system of the last few decades?

Without doubt it was OS/2, says McCarthy, who can find virtually nothing to praise in IBM's attempt to gain a lead over Windows. McCarthy says OS/2 failed dismally because IBM tried to do too much, made it proprietary and failed to give people a simple migration path.

So why did Microsoft succeed? "Microsoft can spell 'device driver' and 'third-party application support'," says McCabe. "And Microsoft had a club that IBM never had. Microsoft had a set of applications that everybody who competed with them knew were going to run on Windows extremely well."

But Fell nominates the Apple OS and RISC hardware platform as rating higher on hype. He says Apple had better technology than Microsoft but missed becoming the industry standard because of "what might be considered one of the greatest fundamental business judgement errors in keeping the system proprietary instead of opening the system to outside development."

Also rating high on the hype meter was Larry Ellison's $500 network computer, which McCabe says was extraordinarily over-sold. If you listened to Ellison, the network computer was going to "sweep the PC off every desktop" in the next wave of client/server computing.

"It's not only that there was a total failure to recognise that there is an enormous investment in technology, and no matter how good the next technology is, you can't just throw everything away and start again. There was also a total failure to appreciate IT managers' need to take an evolutionary rather than a revolutionary approach. That's why Windows-based terminals and things have taken off instead," McCabe said.

Other dishonourable mentions:

Token Ring, which IBM didn't make cheap enough or ubiquitous enough and which failed because IBM "brought an old-world proprietary mindset to the game" (McCarthy).

Videoconferencing (Fell).

Newton, which Fell says was technology introduced ahead of the curve -- in fact, only now are the requisite environmental conditions emerging.

Forecasts

"The next 10 years are going to be like the last 20 years: smaller, faster, cheaper," says McCarthy.

"It's just going to take us to a scope and a size of market that's never been entertained before, in the same way PCs became so much bigger than the minicomputer and the mainframe business that people had problems grappling with that."

We're also reaching the end of the era of enterprise computing, he says. "It's dead, it ain't the game; the game is inter-enterprise computing. I think Y2K is the last hurrah of enterprise computing and the game now is what are you doing outside of it, and what are you doing on new platforms."

Crystal balls are notoriously inaccurate when it comes to long-term predictions about the future of technology, but our three valiant pundits were prepared to give it their best shot.

2000

In the very near future, says McCabe, we'll see a "quite revolutionary" proliferation of handheld devices with wireless Internet access capabilities.

"There's really going to be a hell of a lot of non-PC related devices used to access sites," says McCabe, "and clearly that's a huge opportunity for businesses that get in early and leverage that."

Biometrics will create big developments in security, he says. Within months some notebook suppliers will be offering notebooks with built-in fingerprint readers, although widespread use might take a bit longer.

McCarthy agrees we'll see continued growth of non-PC platforms in the early years of the next century and says it's clear to him the whole PalmPilot phenomenon is going to be one of the next big waves, "despite the rantings and ravings of Bill Gates and Scott McNealy".

2005

"I think by 2005 the PC has become kind of a server, in the same way that the mainframe has become a server -- the tail wagging the dog, no one is going to care. It's all going to be about PDAs, cellphones and their brethren," says McCarthy.

Portability will be the norm in the 21st century, agrees Fell. The growing desire for mobility by both enterprises and end users will spur the next evolution in the IT lifecycle.

"Enterprises will develop allowing the individual to not only access technology from everywhere (the Internet) but from anything. IT appliances from laptop PCs to pocket Internet devices will allow knowledge workers, entrepreneurs, and hobbyists alike to remain constantly connected to the never-ending flow of data, communication, and entertainment."

In the same way that wireless phones have untied phone users from fixed land lines, mobile data will allow any and all future devices to in some way connect to a truly "ethereal" network, Fell says.

In the first few years of the next millennium Fell expects to see smart phones and personal digital assistants begin intercepting bursts of data in the simple form of stock quotes, directions, sports scores, and e-mail. In time these devices will mature to the point where they'll allow graphical two-way communications between individuals -- further driving the exchange of information, ideas, and interpersonal interaction.

In the same time frame, we can expect smart cards to be used for security, says McCabe. It's likely that in many organisations users will plug in smart cards to access devices like Windows-based terminals or similar technologies which will allow them to interface with their files on the server.

But he says widespread use of smart cards by consumers is more likely to be a 10-year, rather than a five-year scenario.

And we will also start to enter the next paradigm shift: CyberSmart computing.

Encompassing both businesses and consumers, CyberSmart will involve natural interfaces, smart content, and the complete cross-environment use of computing technology, Fell says.

"Because CyberSmart computing uses natural interfaces, such as voice, it reaches out to the multitude of people not interested in or able to do, 'computing' in the traditional sense. CyberSmart Computing is about the transparent use of information technology.

"CyberSmart computing is also about smart content. This means information is transformed, analysed, forecasted, graphically represented, searched, assembled or otherwise manipulated to make it easier to digest.

"Microsoft's slogan is 'Where do you want to go today?' However, when you get there you are buried under a mass of raw data. The real question is What do you want to know today?"

To help address the problem we'll see vastly more intelligent search capabilities. There'll be massive advances in the intelligence of search engines and agents, but XML will also facilitate improvements in context within the infrastructure of the Web.
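To make the XML point concrete, here is a toy sketch of our own (not from the analysts; the catalogue, product names and prices are invented): where an HTML page only describes how information should look, an XML document labels what it means, so a search agent can query by structure rather than by scraping rendered text.

```python
import xml.etree.ElementTree as ET

# A hypothetical product catalogue. In HTML, "399.00" would be
# styled text; in XML it is labelled, machine-readable data.
catalogue = """
<catalogue>
  <product>
    <name>PalmPilot Professional</name>
    <price currency="USD">399.00</price>
  </product>
  <product>
    <name>Windows CE handheld</name>
    <price currency="USD">499.00</price>
  </product>
</catalogue>
"""

root = ET.fromstring(catalogue)

# An agent can now search on meaning -- "products under $450" --
# instead of pattern-matching page layout.
for product in root.findall("product"):
    name = product.findtext("name")
    price = float(product.findtext("price"))
    if price < 450:
        print(f"{name}: ${price:.2f}")
```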

During the same time frame Fell believes key software technologies like Java and XML combined with massive databases and 'network intelligence' or 'Knowledge Management' centres will let software agents automate the generation, collation, and conveying of records from the massed array of databases available to the 21st century knowledge worker.

"For example, organisations will be able to revolutionise their relationships with their customers and clients using the technologies encapsulated in sophisticated customer relationship management tools."

We'll still be preoccupied with the Web, e-business and e-commerce over the next five years, says McCabe, but beyond that the picture is likely to be very different.

2010

People may be cynical about it now, but McCabe has no doubts that within 10 years voice recognition software, now in its infancy, will totally change the way we deal with technology.

It's not just that we'll use voice to interact with our operating systems. By 2010 voice technology will be ubiquitous.

Call centres will rely heavily on voice response technology and we'll use natural language interfaces to search the Web.

McCabe says the only constraints on voice recognition -- algorithm quality and processing power -- are both improving exponentially.

He also predicts that by 2010 the ASP phenomenon will finally have matured enough for businesses to embrace it.

By then most organisations that have done their homework will have outsourced everything but their customer data and the mission-critical stuff that differentiates them as a business.

"And the whole phenomena of application service providers should be quite mature by then, and be widely adopted."

McCabe says business already understands the notion of software as a service, but it may take a decade before security, telecommunications reliability and software licensing issues are fully resolved.

"So there are lots of things to be overcome, and certainly by the 10-year scenario I'm comfortable that will be a standard part of doing business.

"Now a lot of people are hoping for it a lot earlier, but it will take time to mature."

Miniaturisation of CPUs and hard drives will mean Microsoft's dream of a PC in every home will come true, says Fell, but not necessarily in the way Gates envisaged.

Instead we'll see refrigerators, video games, TVs, phones, and the like all interacting with each other and with other devices within or outside the home or enterprise.

"The traditional building blocks of IT will proliferate. Processors, storage capacity, software and access to the Web will be built into a multitude of products.

"All manner of devices will be 'enabled', from your car to traditional white goods, from your wristwatch to your clothing."

During the same time frame McCarthy says we'll start to see the intersection of biotechnology and information technology in ways we can't even begin to conceptualise.

Tom Mulligan, general manager, technology services, Colonial Group
Years in IT: 35

The IT profession has shifted from being 'the high priests of technology' to an integral part of the business. Where we once had to write everything from scratch, we can now buy component software off the shelf. We have changed from being code cutters to integrators.

[In the next century] I suspect that high technologists will be fewer in number and that the business end of the profession will be assimilated directly into the business. I see that we will eventually no longer have 'computer people' but rather most business people will use technology as a natural adjunct to their normal working day. Already I see that many of today's business people do not call on the IT department -- they just get on and solve their own problems. As we move forward I see this being a more rational and sustainable model.

The technologists will build and operate the network and server infrastructures and the business will do its own thing on top of that infrastructure.

Indeed I see that few companies will even build that infrastructure, but will outsource it.

You can see that happening already as companies rent systems to run on the Internet rather than buying them. There is no doubt in my mind that the Internet revolution and its accompanying tools will represent a critical transition for all businesses.

The quantum of change this set of technologies will bring is not yet totally clear but they will be massive in their impacts.

Robert J Marnell, chief information officer, Aristocrat Leisure Industries
Years in IT: 30

IT was a 'deep dark secret' when I started 30 years ago. The IT department was like the legal or medical departments -- it had specialists who talked a unique language, and when they told users 'to do this job it will take us six months', blind trust was evoked; users had no idea of what it really took to get a job done.

Over the years users have gotten smarter -- most who have acquired university degrees over the past 10 to 15 years have had to become computer literate just to graduate. Given 'smart users', IT departments needed to change.

While they needed to be on top of technology, relationships with the user departments were just as important. This change has not gone well, given the number of IT departments being outsourced. Unless IT works on relationship management there will be no IT departments within major corporations.

Gary Jackson, Asia-Pacific director for Cisco's service provider business
Years in IT: 26

If you go back to the early 70s there was hardly any computer science being taught -- computer science courses really evolved at the university level during the 70s.

Most of the people involved in IT back then were either Cobol programmers or hardware design people -- there wasn't the massive IT industry that there is now.

[IT] is genuinely a career now, and a major one that's influencing all aspects of business.

I think the Internet has just been a phenomenal shot in the arm for the whole IT industry. It is making information pervasive, and as a result the IT industry is now recognised as one of the most critical industries of the 21st century.

Now senior executives clearly understand the impact, or at least are aware of the dramatic potential of the Internet and online systems and information.

They're far more aware of deliverables out of the IT environment and I think the IT professional has to be far more business oriented.

If you want to be a senior person in IT you really need to be able to communicate very effectively with CEOs and boards, which are taking a lot more interest in what they are going to get out of all this.

Brian Finn, ex-IBM Australia chief executive, currently chairman of multimedia company Impart
Years in IT: 41

I feel IT people have become more professional and more willing to gain new and broader skills. However, in my opinion, here in Australia there's still not enough attention given to graduate and postgraduate studies in a broad range of IT disciplines.

I think we'll see more demand for IT people with formal and practical experience in multiple disciplines. IT will continue to be a great career area, right at the forefront of change. I hope IT will become much more mainstream in terms of importance to organisations, clearly linked to improving business performance and shareholder value. In the past 20 years the Internet has been the most significant development.

In the short term I guess the most difficult problem for most businesses is to adapt to new opportunities for distribution fast enough to stay relevant but not so fast as to alienate existing distributors and partners.

Longer term, I suspect that it will be a real challenge for established companies to compete effectively with newcomers who can capitalise on the new technologies without having to re-engineer an established infrastructure.

Dr Peter Jones, interim general manager, Australian Centre for Advanced Computing and Communications at the Australian Technology Park
Years in IT: 44

When I started my career IT had not been invented -- you had computer engineers, programmers, and the like, numbered in the hundreds.

The 50s and early 60s were periods of exciting pioneer computer design, experimenting with everything new -- from circuits, memory, peripherals, computer architecture, operating systems, computer languages to applications software.

In the 50s we advanced from hand calculators used to solve partial differential equations to Silliac, Sydney's version of Illiac, the University of Illinois Automatic Computer, which was based on John von Neumann's design.

[In terms of overhyped technologies over the past decades] it's hard to over-hype technology when we've seen a million-fold performance improvement in the past 50 years, but perhaps the lack of simple, clean, elegant, easy-to-use interfaces has been the biggest problem.

Now IT rules our lives, with a large number of job categories populated by hundreds of thousands of people.

Tom Forgan, CEO, Australian Technology Park
Years in IT: 10

There will be an increase in demand for analog and digital skills combined. The ultimate word will be flexibility and interchangeability.

Advanced database and Web support will obviously be prime areas for career growth.

There could well be a significant role for IT counselling and consulting as the systems and equipment become more complex and people start to rely on them even more.

The next few years will bring advanced communications and the real integration of mobile devices and computing.

The use of WML, the WAP mobile browser language, will make the PC as we know it obsolete.

This, strangely, is being driven by the games market.

The new PlayStation 2 (6.2GFLOPS) will be more powerful than the desktop PC, and mobile network access at high communication rates will be driven by the need for real-time interactive games.

The development of advanced photonic networks and switching systems will move the PC to a new level.

I would hope that we could use these technologies to grow new startup IT&T companies in Australia.

Valda Berzins, chief information officer, Australia Post
Years in IT: 10

There's been a view that users would be able to program systems and run them, and that we wouldn't really need IT specialists.

Every few years someone comes up with a new programming language or a further development in GUIs, and says this means you won't even need IT professionals because users will do it all themselves. That's never happened.

It will be very important for IT people to understand the business, because if they are working towards competitive advantage they've got to be able to understand what sort of things are important to the company and where the opportunities lie.

I am looking forward to using satellite and wireless communications because Australia is a pretty big country and there are a lot of isolated people. Australia Post certainly has to service remote areas. When those technologies become more advanced and cheaper it will bring a lot of scope not only for Australia Post but also for other organisations.

All industries are going to have to review their business models and look at process improvement or redesigning certain elements of their organisation because the Internet is going to change a lot of business.

Leon Daphne, managing director and CEO, Nissan Motor Co, Australia
Years in IT: 39

IT has changed from being essentially a profession with a mainly technical orientation to a profession that is much more involved in the mainstream activity of the business.

I think there will always be a requirement for some technical orientation in IT professions, but there will be more and more concentration on business-oriented activities using CASE tools and Internet support.

I started my career in IT in 1961 and worked initially on unit record equipment and then on one of the first IBM 1401 computers. The key technologies in those days were the replacement of punch cards with disk storage, the introduction of VDUs and the development of business-oriented programming languages such as Cobol. The concentration on 'structured programming' was the most hyped technology that didn't develop to most people's satisfaction.

[Over the last 20 years, the most significant developments have been] the development of the personal computer and user-oriented software, and most recently the Internet and the expansion of online real-time systems. I look forward to the full development of e-commerce through communication systems and of the Internet itself.

50 YEARS GONE BY

1951 John Mauchly and John Eckert build the Univac I, the first commercial electronic computer, which is installed at the US Census Bureau. Grace Murray Hopper develops A-0, which translates programming code into binary code.

1953 IBM manufactures its model 650, the first mass-produced computer.

1955 Narinder Kapany develops the optical fibre.

1956 IBM develops the first hard drive, called RAMAC. Programmers at IBM write the computer language Fortran.

1958 Texas Instruments builds the first integrated circuit. Bell Telephone introduces the first modems.

1959 Grace Murray Hopper and Charles Phillips invent Cobol.

1960 Digital Equipment Corporation develops the PDP-1, the first commercial computer equipped with a keyboard and monitor.

1964 The American Standards Association adopts ASCII as the standard code for data transfer.

1965 The simplified computer language BASIC is developed.

1968 Intel is formed.

1969 ARPAnet, precursor to the Internet, debuts. ATMs become more widely used in banks. "Bubble memory" makes its debut, allowing computers to retain memory after being shut off.

1970 The floppy disk is introduced. Intel develops the first memory chip, which stores 1024 bits of data. Bell Labs develops Unix.

1971 The first speech-recognition software, Hearsay, is developed at Carnegie Mellon University.

1973 Bob Metcalfe at Xerox PARC develops Ethernet.

1975 The US federal government's antitrust suit against IBM goes to trial; the government will drop the case in 1982, but not before producing some 30 million pages of documentation.

1976 Steve Wozniak and Steve Jobs form Apple Computer.

1977 Bill Gates and Paul Allen officially found Microsoft.

1978 WordStar is released and quickly becomes the most popular word processing program.

1981 IBM introduces the IBM PC with an MS-DOS operating system.

1983 Apple introduces the $9995 Lisa, the first personal computer sold with a graphical user interface and a mouse. IBM launches the PC-XT, the first computer with a built-in hard drive.

1984 CD-ROM debuts; Apple releases the Macintosh. 2400-baud modems are introduced.

1985 Microsoft develops Windows 1.0 for the IBM PC.

1986 Microsoft goes public. The National Science Foundation approves funding for the Internet backbone.

1988 Microsoft releases Windows 2.03, with overlapping windows that resemble the Macintosh's, and Apple files suit; six years and some $10 million later the court will decide in Microsoft's favour.

1989 Tim Berners-Lee invents the World Wide Web.

1993 Intel releases the Pentium chip. Marc Andreessen and Eric Bina design Mosaic, the first graphical Web browser.

1995 Microsoft introduces Windows 95 and Office 95. Jeffrey Bezos founds Amazon.com. Netscape goes public.

1998 E-commerce explodes as a new shopping medium with some 30 million households purchasing goods over the Internet.

1999 The Linux OS hits the big time. AOL completes its acquisition of Netscape. Microsoft, with 27,320 employees, reaches $14.48 billion in sales. US District Court Judge Thomas Penfield Jackson finds Microsoft operates as a monopoly and has used its power to hurt both consumers and competitors.
