When Gordon Moore wrote his speculative article 'Cramming more components onto integrated circuits', 40 years ago this week, the laptop was still very far away.

Halfway through the piece lies a cartoon that says it all: at a trade fair, a salesman with a cheesy grin offers punters a box the size of a book from a stall offering 'Handy Home Computers'. Clearly, Moore's assertion that "integrated circuits will lead to such wonders as home computers" was regarded by the magazine as something of a joke.

And perhaps that's not surprising. Moore confesses that the only user he could imagine for such a thing was "the housewife putting her recipes on it".

Gordon Moore, originator of 'Moore's Law': the observation that the number of components on a computer chip doubles every year or two. Credit: © AP Photo/Paul Sakuma

But the idea of a home computer was no more outrageous than the article's punch line. Moore suggested that by 1975 it might be possible to cram 65,000 electronic components onto a single silicon chip just a quarter of a square inch in size.

Given that, in 1965, integrated circuits with 50 components represented state-of-the-art chip complexity, this claim was bold, verging on reckless. But Moore was right. And incredibly, he has gone on being right for four decades.

He pointed out that, since their invention, the complexity of silicon integrated circuits had doubled every year: an exponential rate of growth. Moore's Law is now usually cited as saying that the density of chip components, a measure of 'complexity', doubles every 18 months.
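The arithmetic behind that rate is simple exponential growth, and it is worth seeing how it squares with the figures in this article. A minimal back-of-the-envelope sketch in Python; the starting count of 50 components in 1965 and the two doubling periods come from the text, everything else is illustrative:

```python
# Back-of-the-envelope extrapolation of Moore's observation: component
# count grows as start * 2 ** (elapsed_years / doubling_period).
def components(start, start_year, year, doubling_years):
    return start * 2 ** ((year - start_year) / doubling_years)

# Doubling every year, as Moore observed in 1965, from the ~50-component
# state of the art: about 51,000 by 1975 -- the same order of magnitude
# as the 65,000 he predicted.
print(round(components(50, 1965, 1975, doubling_years=1)))

# Doubling every 18 months for 40 years gives roughly 5 billion -- an
# overshoot of the ~half a billion transistors on a real 2005 chip,
# a reminder that the doubling period is only a rough average.
print(round(components(50, 1965, 2005, doubling_years=1.5)))
```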

What this means 40 years on is that the complexity of silicon chips has reached astonishing proportions: a modern chip may contain half a billion transistors.

Back for Moore

Spotting this trend so early in the game was remarkable. In 1965, Gordon Moore was at the centre of the nascent semiconductor industry. He was the 36-year-old director of research and development at the Fairchild Semiconductor Corporation, where Robert Noyce devised the integrated circuit in 1959. Moore says that not everyone at that time saw a need for integrated circuits: just give us the parts, some electronic engineers said, and let us build our own circuits.

Before joining Fairchild, Moore had worked at the company set up by one of the inventors of the transistor, William Shockley. In 1968, Moore and Noyce quit Fairchild to set up Intel (they decided not to call the company Moore Noyce, because in electronics 'more noise' is not a good thing). Intel produced the first microprocessor chip, the Intel 4004, in 1971, and from there Moore's Law traces a direct line to your Bluetooth-linked, slimline laptop.

Increasing the device density of a circuit of fixed size means that the components must get smaller. And miniaturization has been made possible by adapting etching technologies to carve out ever smaller structures in silicon.

"By making things smaller," said Moore, "everything gets better simultaneously." The speed of computers goes up, the power consumption per component goes down, the systems get more reliable, and the cost of doing a job electronically drops so dramatically that a Gameboy now has more computer power than was carried aboard the Apollo 11 Moon mission.

Originally proposed as an empirical observation, Moore's Law is regarded by some as a self-fulfilling prophecy: to fail to sustain it would be to admit defeat. But it is probably fairer to say that the 'law' is driven by fierce competition in the industry.

Some have argued that industry now has an unhealthy obsession with the law, focusing on ever-higher chip density no matter what the cost of development. Google chief Eric Schmidt raised eyebrows three years ago when he announced that his company had no intention of paying the high premium for Intel's latest Itanium microprocessor, and would instead build its servers from cheaper chips. More power isn't the answer to everything.

Going up?

Moore didn't suggest how long the exponential growth he identified would last; in fact, he didn't look far beyond the 1970s. Several times over the past decade or so, it has been predicted that the end is in sight, but the semiconductor industry just keeps going.

The current state of play is represented by 90nm technology, which (despite its name) produces transistors just 50 nanometres (50 millionths of a millimetre) across. The next-generation chips, due this year, will use 65nm technology, and by 2010 Intel expects to be working at scales of 15 nanometres. Silicon devices this small have already been shown to be functional.

But there has to be some limit to miniaturization. Even Moore admitted, "It can't continue forever." If his law is extrapolated to the middle of the twenty-first century, it implies that a transistor will be as small as a single atom.
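That extrapolation is easy to reproduce. If component density doubles at a fixed rate, linear feature sizes shrink by roughly a factor of √2 with each doubling; here is a rough sketch of when that shrinkage hits atomic dimensions, taking the ~50-nanometre transistors mentioned above as a 2005 starting point and an atom to be about 0.2 nanometres across (both round numbers chosen for illustration):

```python
import math

# When does the shrinking transistor reach the size of a single atom?
# Assumes linear feature size falls by sqrt(2) each time component
# density doubles (area scales as length squared). The 50 nm start and
# 0.2 nm "atom" are illustrative round numbers, not Moore's figures.
def year_of_atomic_limit(start_nm=50.0, start_year=2005, atom_nm=0.2,
                         doubling_years=2.0):
    linear_shrink = start_nm / atom_nm
    density_doublings = 2 * math.log2(linear_shrink)
    return start_year + density_doublings * doubling_years

print(round(year_of_atomic_limit(doubling_years=1.5)))  # ~2029
print(round(year_of_atomic_limit(doubling_years=3.0)))  # ~2053, roughly mid-century
```

Depending on the doubling period assumed, the crossover lands anywhere between the late 2020s and around the middle of the century; either way, the atomic wall is not far off.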

Even before then, components will become so small that the insulating films between them will be too thin to prevent short circuits. And the heat generated by electrical currents in such dense circuitry threatens the chips with meltdown.

Atomic theories

So what can be done? Some believe that the miniaturization trend will be sustained by building circuitry from single molecules, although no commercially feasible prototype has yet been demonstrated. Others say that the laws of quantum mechanics will be harnessed to allow huge numbers of computations to be conducted in parallel in 'quantum computers', but this is a technically challenging and practically remote option.

Strategies for the short term involve fundamentally new transistor designs, or circuits with devices that can link themselves up differently for each computational task. These represent completely different ways of doing computing, so in a way it is meaningless to try to fit them onto the graph of Moore's Law.

It's still anyone's guess how the chip will be reinvented. But if, in the meantime, you have a copy of the 19 April 1965 issue of the now-defunct Electronics magazine in your attic, you might want to get in touch with Intel. Last week, the company was offering to buy one for US$10,000.