The future is more than Moore's was

Moore's Law is dead. Long live Moore's Law, which for most of us was the opening salvo in the intellectual exercise of relating society to technology.

Intel co-founder Gordon Moore's observation, made 40 years ago last month in Electronics magazine, originally dealt only with the density of semiconductors: "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... this rate can be expected to continue, if not to increase."

Later, the remark was tweaked and elevated to the status of a "law" as it gained a cult following among technologists trying to understand the impact of technology. Like an overworked protocol, it was extended to areas unimagined by its creator -- predicting the economics of technology and the next big thing in the Information Age.

History may recognize Moore's Law as the first crack at an algorithm to describe the accelerating progress of technology. Pre-Moore predictions were mainly the domain of futurists like HG Wells, who described technology as progressing on a linear scale. Wells and other futurists described a post-industrial world of fancier machines, exotic airships, death rays and magical communications replacing steam engines.

Gauging the future in Wells' turn-of-the-century time was largely science fiction. The rate of technology change was quickening in late Victorian times, yet it was slow by today's standards and could hardly be accurately observed, much less modelled or measured.

Moore brought the future into clearer focus with a useful, if misunderstood and somewhat erratic, lens for measuring technology's growth exponentially. Now, after 40 years, the future of Moore's Law ain't what it used to be. Moore's Law self-destructs when the economics of digital electronics reaches the point of declining returns.

For a fresh look at the future, meet Ray Kurzweil and the Law of Accelerating Returns. Kurzweil, best known for the digital keyboards (more properly called synthesizers) that bear his name, is one of the most amazing figures of the current epoch. His Law of Accelerating Returns, introduced in a 2001 essay, measures the computational power of machines back to the time of Wells.

Importantly, it frees the rate of technology change from any specific implementation, like semiconductors and transistors.

Through five technology paradigms, from electromechanical, through vacuum tubes to integrated circuits, Kurzweil finds the rate of technological change to be even more dramatic than Moore forecast. Computational power doubled every two years, Kurzweil found, from the late Steam Age through the Electromechanical Age, until 30 years ago. Then, with the dawn of the Digital Age, computational power began to double yearly.
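The gap between those two doubling periods compounds dramatically. A quick sketch (our arithmetic for illustration, not a calculation from Kurzweil's essay) of where 30 years takes each regime:

```python
# Illustrative arithmetic: how much computational power multiplies
# over 30 years under each doubling regime described above.
years = 30

# Pre-digital regime: power doubles every two years.
every_two_years = 2 ** (years / 2)   # 15 doublings -> 32,768x

# Digital regime: power doubles yearly.
yearly = 2 ** years                  # 30 doublings -> ~1.07 billion x

print(every_two_years, yearly)
```

The same three decades that once bought a 30,000-fold improvement now buy a billion-fold one, which is why the shift in doubling period matters far more than it first appears.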

"Technological change is exponential, contrary to the common-sense 'intuitive linear' view," Kurzweil wrote. "We won't experience 100 years of progress in the 21st century -- it will be more like 20,000 years of progress (at today's rate). . . . Within a few decades, machine intelligence will surpass human intelligence."
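Kurzweil's "20,000 years" figure is easy to sanity-check. The sketch below is our own illustration, not code from the essay: it assumes the rate of progress itself doubles roughly every decade (a period consistent with the Law of Accelerating Returns, but an assumption here), then integrates that accelerating rate over a century. The function name and step size are ours.

```python
def progress_over(years, doubling_period=10.0, step=0.01):
    """Numerically integrate an exponentially accelerating rate of
    progress, measured in 'years of progress at today's rate'.
    Assumption for illustration: the rate doubles every `doubling_period`
    years."""
    total, t = 0.0, 0.0
    while t < years:
        total += (2.0 ** (t / doubling_period)) * step
        t += step
    return total

linear = 100                      # the 'intuitive linear' view: 100 years
exponential = progress_over(100)  # accelerating view: on the order of 10^4 years
print(round(exponential))
```

Under this assumption a century delivers on the order of fifteen thousand years of progress at today's rate, the same order of magnitude as Kurzweil's figure, while the linear intuition predicts just 100.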

In the second decade of the 21st century, we will enter what Kurzweil labels the sixth paradigm of computing, which will harness three-dimensional computing using the human brain as a model. Kurzweil sees the constraints posed by the Law of Declining Returns being brushed aside by stages of evolution that provide more-powerful tools for each succeeding generation.

"The bulk of our experiences will shift from real reality to virtual reality," he wrote. "Most of the intelligence of our civilization will ultimately be nonbiological, which by the end of this century will be trillions of trillions of times more powerful than human intelligence."

This future, according to Kurzweil and others, leads to the Singularity, where societal, scientific and economic change is so fast we can't even imagine what will happen from our present perspective. According to Kurzweil, the Singularity represents a "rupture in the fabric of human history", the impact of which is impossible to know but will doubtless be profound for both man and machine.

Kurzweil didn't invent the Singularity, only a model for measuring our progress toward it. The Singularity is generally credited to mathematician and author Vernor Vinge, who began expounding on the topic in the 1980s. He explained it in detail in his 1993 essay "The Coming Technological Singularity".

And just so you don't get too comfortable, beyond the Singularity could lie the Gray Goo, posited by Eric Drexler of the Foresight Institute, a nanotechnology think tank. The Gray Goo is a return to the primordial pool, this one formed by nanomachines that destroy mankind. Get your popcorn and settle in for this accelerated version of reality. It's just starting, and the future is more than it once was.

Mark Willoughby, CISSP, is a 20-year IT industry veteran and journalist
