
Moore's Law in Jeopardy
By Icarus Tull

WASHINGTON, DC--Nearly four decades ago, an obscure scientist named Gordon E. Moore postulated what has become known in technological circles as Moore's Law by predicting that the power of the silicon chip would double annually, with a proportionate decrease in cost. With this bit of insight, the godfather of the modern technological impulse effectively forecast, and played a leading role in ushering in, the computer revolution and its frame - the Information Age. This whiz kid prophet co-founded Intel Corporation in 1968, initially serving as Executive Vice-President until he was elected Chairman and Chief Executive Officer in 1979. Moore remained CEO until 1987 and is now Chairman Emeritus of Intel Corporation.

In an inconspicuous 1965 magazine article, Moore, then Fairchild Semiconductor's R&D director, reluctantly predicted the expected increase in the power of integrated circuits over the next 10 years. By the 1970s, Moore was a cofounder of Intel, and his tenuous "law" was well on its way to becoming a self-fulfilling prophecy among researchers, manufacturers, and vendors. Moore's Law, in its sliding forms, has governed Silicon Valley like an immutable force of nature. After Moore revised the figure to a doubling of processing power every 18 months, his words were treated as an axiom - rather than the rule of thumb they actually were. No one knows this better than Gordon Moore.

In an interview in the May 1997 issue of Wired Magazine, Moore suggested that this pace will hold for at least a few more generations of technological innovation. Then, sometime near the end of the first decade of the 21st century, we should expect a distinct slowing in the rate at which the doubling occurs - perhaps to every three years rather than the current eighteen months.
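The difference between those two doubling rates can be made concrete with a little exponential arithmetic. A minimal sketch - the starting transistor count and the function name here are illustrative assumptions, not figures from the article:

```python
def moores_law(n0: float, years: float, doubling_months: float) -> float:
    """Project a transistor count forward from n0, assuming it doubles
    once every `doubling_months` months."""
    return n0 * 2 ** (years * 12 / doubling_months)

# Ten years out from a hypothetical 10-million-transistor chip:
fast = moores_law(10_000_000, 10, 18)  # historical 18-month pace
slow = moores_law(10_000_000, 10, 36)  # the slower 3-year pace Moore anticipated
print(f"18-month doubling: {fast:,.0f} transistors")
print(f"36-month doubling: {slow:,.0f} transistors")
```

At the 18-month pace, ten years yields roughly a hundredfold increase - which is how a billion-transistor chip comes into view from today's ten-million-transistor designs; at the 3-year pace, the same decade yields only about a tenfold gain.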

So, is the end of our technological advance near? Moore suggests that chip innovators are running into a barrier they have run up against several times before: the limits of optical lithography. Using ultraviolet light to print the patterns of circuits, chip designers have reached a point where the required wavelengths are too short for effective lenses to be built. The alternative of harnessing X rays to continue the process has yet to be properly explored.

Theoretically, X rays can keep chip designers on this production curve; in practical terms, however, there are still problems. Once it moves away from optical lithography, the industry must ratchet the successor technique up to the same level of sophistication to keep making progress on the same track. X rays represent such a dramatic change that it will be difficult to build on what the laboratories have accomplished in the past. Starting over, the industry may need a considerable length of time - in technology terms - to establish traction.

The $200 billion industry, which typically invests 10 percent of its revenues in research and development, is understandably worried about this slowdown. A significant fraction of that investment will be aimed at solving this problem. Moore hopes that this research will uncover something to make the transition from light waves to X rays a smooth one. Unfortunately, there is another problem: the cost of manufacturing facilities doubles every chip generation. Intel has two plants that will cost more than $2.6 billion apiece.

The upside to this rising-cost trend is that these superchips - imagine a billion transistors on a chip - could keep the industry focused, functional, and intellectually engaged for the next century. The most advanced chips in design today sport fewer than 10 million transistors. And while Moore admits that the industry wouldn't have the foggiest notion of what to do with a billion transistors in today's world, the future will bring its own needs and rewards.

Science fiction is still science fiction as far as Moore is concerned. His skepticism in areas that have long been fodder for fantasy writers comes with only a hint of second-guessing. A chemist by training, he has little faith that technologies such as DNA computing, organic semiconductors, quantum computing, or computers built with nanotechnology will supersede microprocessors, although he admits that quantum devices could indeed be down the pike a few decades from now, musing, "Quantum devices may be the ultimate transistors. The transistor doesn't behave very well when you get down to very small dimensions, but that gets into the realm where things like quantum devices start working."

Moore earned a B.S. in Chemistry from the University of California at Berkeley and a Ph.D. in Chemistry and Physics from the California Institute of Technology. He was born in San Francisco, Calif., on Jan. 3, 1929. He is a director of Gilead Sciences Inc., a member of the National Academy of Engineering, and a Fellow of the IEEE. Moore also serves on the Board of Trustees of the California Institute of Technology. He received the National Medal of Technology from President George Bush in 1990.

Technological Impulse - The Scenewash Project 20003