Nature | Editorial


There's more to come from Moore

Moore's law is approaching physical limits: truly novel physics will be needed to extend it.


Hail Gordon Moore: 19 April marks the anniversary of the famous prediction by the (less) famous man that the late twentieth century would herald massive increases in computing power, stimulating the technological age.

Electronics and information technology now touch almost every aspect of life. Kicking off with the invention of the integrated circuit in 1958, the continuing electronics revolution is, in large part, down to the technology industry’s faithful compliance with what came to be known as Moore’s law.

In 1965, Moore, a chemist turned electronic engineer, noticed that in the years since the first integrated circuits were built, engineers had managed to roughly double the number of components, such as transistors, on a chip every year. He also predicted that the rate of component shrinkage — which he later revised to a doubling every two years — would continue for at least another decade.
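The arithmetic behind the law is simple compounding. A minimal sketch, assuming the commonly cited starting point of roughly 64 components in 1965 and the revised two-year doubling period (both figures here are illustrative assumptions, not taken from this editorial):

```python
# Sketch of Moore's doubling rule: component count compounds by a
# factor of two every fixed interval. Baseline values are assumed
# for illustration only.

def components_on_chip(year, base_year=1965, base_count=64, doubling_years=2):
    """Project the component count under a strict doubling rule."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1975, 1985):
    print(year, int(components_on_chip(year)))
```

Under these assumptions, 64 components in 1965 becomes 2,048 by 1975 and 65,536 by 1985 — the exponential growth that carried the industry for five decades.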

The semiconductor industry never looked back. It has continued to shrink transistors and produce computer chips that combine increasingly high performance and functionality.

For the first few decades, the semiconductor industry met Moore’s law mainly through feats of engineering genius and gigantic strides in manufacturing processes. But the key role of fundamental science is also worth remembering, especially as researchers today seek ways to maintain the rate of progress.

The invention of the transistor at Bell Laboratories in Murray Hill, New Jersey, in the 1940s was firmly based on the development of semiconductor band theory. And scientific breakthroughs played an important part in the subsequent developments of technology. Notably, in 1970, the Russian physicist Nikolay Basov and collaborators developed excimer lasers that would later be used to etch tiny circuit patterns on the silicon wafers from which chips are made.

The 1990s called for further innovation. Until then, as transistors became smaller, their speed and energy efficiency increased. But when the components reached around 100 nanometres across, miniaturization began to have the opposite effect, worsening performance. Chip-makers such as Intel, which Moore co-founded, and IBM again looked to basic science to improve the performance of transistor materials. Major help came from condensed-matter physicists. They had known for decades that the ability of silicon to conduct electricity improves substantially when its crystal lattice is stretched — for instance, by layering it on another crystal in which the atoms have a different spacing. Engineers introduced strained silicon into chips in the 2000s, and Moore’s law stayed true for several more years.

State-of-the-art microprocessors now have transistors that are just 14 nanometres wide, and Moore’s law is finally approaching the ultimate physical limits. Waste heat in particular has become a source of concern. It has already caused one form of Moore’s law — the exponential acceleration of computer ‘clock speed’ — to grind to a halt. Power-hungry chips also limit the ability of mobile devices to survive more than a few hours between charges.

The introduction of advanced materials such as hafnium oxide, which provides insulation even when it is just a few atomic layers thick, has managed to keep chips a bit cooler. Heroic efforts might yet bring one or two more generations of smaller transistors, down to a size of perhaps 5 nanometres. But further improvements in performance will require fundamentally new physics.

Where are we headed? Transistors that use quantum tunnelling, perhaps? Or those in which currents transport quantum spin rather than electric charge? Labs around the world are experimenting with approaches and materials that could drastically cut energy consumption. One avenue that could be exploited is the inherent stability of the collective ‘topological’ properties of atoms: a modern twist on the ancient practice of encoding information by tying knots. Some researchers are trying out radical ‘neuromorphic’ circuit architectures inspired by the plasticity of the brain’s neuronal networks.

A principle that works well in a physics lab will not necessarily translate into something that can be mass-produced. And inevitably, most of today’s attempts will lead nowhere. Society should have confidence, however, that somewhere, somehow, basic science will provide a way to maintain progress. Moore should be proud that we have not yet found the exception that proves his law.

Journal name: Nature
Volume: 520
Pages: 408
DOI: 10.1038/520408a

