Here's our take on the 10 biggest IT happenings chronicled in our pages over the past 20 years.
The Internet goes commercial
The Internet scene in the mid-1980s was dominated by discussions of acceptable-use policies, through which government and academic users sought to restrict Internet access to, well, government and academic users. Unacceptable uses of the Internet, such as porn and spam, hadn't been thought of yet; in those days, "unacceptable" meant commercial. Today, billions of dollars in transactions flow through the Net every month.
Fortunately, Computerworld never ran a story with the headline, "Al Gore Invents Internet." The real inventors wrote a seminal report for the National Research Council in 1988 titled "Toward a National Research Network," which spurred the development of interconnecting high-speed networks and encouraged IT vendors to build TCP/IP into their products. In 1989, Tim Berners-Lee wrote a paper describing "a distributed hypertext system," which would become the World Wide Web.
E-commerce became an obsession when the dot-com bubble started to inflate in 1997. Even after the bubble popped in 2000, however, corporate enthusiasm for the Internet hardly slowed. Today, some of the hottest ideas in computerdom -- Web services, VoIP, service-oriented architectures and utility computing -- are grounded in the Internet.
Monopoly musical chairs
IBM dominated computing until the late 1980s. But its 1981 release of the IBM PC and the acceptance of PC clones, which were packed with Microsoft's software, created a desktop computing market that changed the face of IT and put Microsoft at the center of power in the industry. Software developers flocked to DOS, and later Windows, to create thousands of applications, helping propel Microsoft's desktop operating system market share to more than 90 percent in the 1990s.
The government's concern about a Microsoft monopoly started with a 1991 investigation and culminated in 2000, when a federal district court judge ruled that the company had violated the U.S. Sherman Antitrust Act. Microsoft now faces threats from Linux, Google and Europe's antitrust regulators.
The Y2K 'problem'
Nowadays, when you're prompted to enter a date, you'll see something like "mm/dd/yyyy." Quite an innovation, that four-digit year.
The first printed mention of a Y2K Armageddon was made in Computerworld in 1984. In 1993, we printed Peter de Jager's estimate that Y2K repairs would cost US$100 billion. As hysteria mounted, cost estimates soared to close to $1 trillion.
On Jan. 2, 2000, the whole thing was dismissed as a bad dream and promptly forgotten. IBM said the average large company spent up to 400 man-years on the problem. Was that effort wasted? No -- how else could we have justified scrapping those old Cobol systems?
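The bug behind all that spending was simple: systems stored years as two digits to save memory, so "00" read as earlier than "99" and date arithmetic went haywire at the rollover. A minimal sketch of the failure (illustrative Python, not drawn from any actual Cobol system):

```python
# Y2K in miniature: with two-digit years, 2000 ("00") looks earlier than 1999 ("99").
def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Age computed the pre-Y2K way, from two-digit years."""
    return current_yy - birth_yy

# A customer born in 1970, checked in 1999: works fine.
print(age_two_digit(70, 99))  # 29

# The same customer on Jan. 1, 2000 ("00"): age comes out negative.
print(age_two_digit(70, 0))   # -70

# The four-digit fix is trivial in logic; the expense was widening
# every stored date field and every program that touched one.
def age_four_digit(birth_year: int, current_year: int) -> int:
    return current_year - birth_year

print(age_four_digit(1970, 2000))  # 30
```

The arithmetic fix was never the hard part; the hard part was finding every two-digit date buried in decades-old files and code.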
The new foreign face of outsourcing
The practice of IT outsourcing stretches back to 1949 with ADP's mission to be the payroll service for the world. In 1962, Ross Perot started Electronic Data Systems to be a general-purpose IT outsourcing shop. And when Lou Gerstner took over IBM in 1993, his turnaround strategy was largely based on pushing IBM's outsourcing services. But outsourcing became a contentious labor and political issue early this century when U.S. corporations stepped up sending IT work offshore during an economic downturn. India's offshoring revenues in fiscal 2005 skyrocketed 34.5 percent to US$17.2 billion, with more than 1 million Indian IT workers serving overseas customers.
The rise of personal computing
"The concept of 'a PC on every desk' has gone from being a gleam in the bespectacled eyes of a young Bill Gates to a near campaign promise by H. Ross Perot," Computerworld wrote in 1992. But then we went on to suggest that the real impact of personal computing wouldn't be felt until well into the next millennium.
The proliferation of the desktop computer happened much faster than that, of course, as did the rise of personal computing -- via laptops, PDAs, cell phones and other devices. Personal computing is just one dimension of the epochal movement of computing away from centralized mainframes to client/server computing, multitier distributed computing, grids and more.
But unlike the emergence of the minicomputer and the server, the rise of the PC had special meaning for IT managers: It meant they were no longer in control. That Lotus 1-2-3 spreadsheet user was programming, whether the IT shop liked it or not.