Millennium myths and fixes

How much intelligence does it take to sneak up on a leaf?

-- Larry Niven

People get all excited about the darndest things.

I know otherwise normal people who froth at the mouth when they hear me say the millennium starts January 1, 2000. No, they insist angrily, it begins January 1, 2001. Don't I know anything?

Well yes, I do. I know it is more a matter of opinion than the hard-liners think. Why? Let's begin with a startling realisation: Decades and centuries don't line up!

Decades are named for their first year and span years ending in 0 through 9, so the 1990s begin with 1990 and end on December 31, 1999. Very few people claim that the year 2000 is part of the 1990s.

Centuries, on the other hand, run from year 1 through year 100. That makes sense -- this is, after all, the 20th century, so the year 2000 had better be part of it.

The question of when the millennium begins, then, all boils down to this: does it begin with a new decade or a new century? I say it starts with the new decade, in 2000. You're free to wait until the new century begins, but I'm guessing that you'll miss an awesome party on December 31, 1999.
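The arithmetic behind the mismatch is easy to check. Here is a rough sketch in C, assuming only the usual conventions described above (decades named for their first year, centuries counted from year 1):

#include <stdio.h>

/* A rough check of the mismatch: decades are commonly named for their
   first year (the 1990s run 1990-1999), while centuries are counted
   from year 1 (the 20th century runs 1901-2000). */
int main(void)
{
    int year = 2000;
    int decade_label  = (year / 10) * 10;       /* 2000 -> the "2000s"  */
    int century_label = (year - 1) / 100 + 1;   /* 2000 -> 20th century */

    printf("%d falls in the %ds and in century %d\n",
           year, decade_label, century_label);
    return 0;
}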

And you won't get to attend one the following year, because the world will, of course, end in the year 2000, destroyed by ubiquitous computer failures.

Just kidding. As it always does, the world will muddle through, saved by a mixture of planning, hard work, and improvisation.

Survival is quite an accomplishment for the working CIO. Surviving the year 2000 will also be quite an accomplishment.

Two big myths surround the year 2000 problem. The first is that it's a bug. The second is that it is a mess because, somehow, the end of the millennium snuck up on unwary CIOs all over the world.

Let's explore these myths now so you can focus on solving the problem instead of avoiding the blame.

The way we encode dates, or at least the way we used to encode dates, was an intelligent design decision back in the 1960s and 1970s, when in-house and commercial programmers wrote most of our legacy systems. Storage -- both RAM (we called it "core memory" back then) and disk -- cost lots of money, and the best programmers were those who could squeeze the most performance into the smallest computing footprint. Saving two bytes per date field made all kinds of business sense, and nobody figured these systems would have to last three decades or more.
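To make the trade-off concrete, here is a minimal sketch with a hypothetical record layout showing how a two-digit year saves those two bytes, along with "windowing", one common repair: interpret the two digits relative to a pivot year instead of assuming the 1900s.

#include <stdio.h>

/* Hypothetical legacy record: the year is stored as two characters,
   saving two bytes per date field over a four-digit year. */
struct legacy_date {
    char yy[2];   /* "98", not "1998" */
    char mm[2];
    char dd[2];
};

/* Windowing: with a pivot of 50, two-digit years 00-49 are read as
   2000-2049 and 50-99 as 1950-1999. The pivot is a per-shop choice. */
int expand_year(const char yy[2])
{
    int two = (yy[0] - '0') * 10 + (yy[1] - '0');
    return (two < 50) ? 2000 + two : 1900 + two;
}

int main(void)
{
    struct legacy_date d = { {'9', '8'}, {'1', '2'}, {'3', '1'} };
    printf("stored %c%c, expanded %d\n", d.yy[0], d.yy[1], expand_year(d.yy));
    return 0;
}

Windowing only buys time, of course; the permanent fix is expanding the field to a four-digit year.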

They're still running, either because we failed at our grandiose replacement projects (I've seen several of these), or because there simply has been no compelling business reason to replace systems that work just fine.

That is, it really is a feature, not a bug, and it proves once again that no good deed ever goes unpunished. Here is who will be punished: you, for not starting to fix the problem several years ago. And it isn't entirely your fault.

I remember asking, back in 1994, whether we had any year 2000 problems; just a few worriers had started to write about the subject then. It didn't matter. We had a tight budget, we had just reduced staffing 10 per cent to help the company improve its short-term profitability, and we had the usual laundry list of urgent projects. The millennium would just have to wait a year or two until it became urgent.

Business has a short-term focus because Wall Street drives business strategy, and Wall Street insists on quarter-by-quarter earnings improvement. Fixing year 2000 software problems adds no new value, so until the problem reached crisis proportions last year, few companies bothered to spare any resources to fix it.

There's plenty of blame to spread around, but let's not.
