EDITORIAL

Big data needs a hardware revolution

Artificial intelligence is driving the next wave of innovations in the semiconductor industry.
Software companies make headlines, but research on computer hardware could bring bigger rewards. [Image: a worker poses next to a supercomputer.] Credit: Morris MacMatzen/Getty

Advances in computing tend to focus on software: the flashy apps and programs that can track the health of people and ecosystems, analyse big data and beat human champions at Go. Meanwhile, efforts to introduce sweeping changes to the hardware that underlies all that innovation have gone relatively unnoticed.

Since the start of the year, the Semiconductor Research Corporation (SRC) — a consortium of companies, academia and government agencies that helps to shape the future of semiconductors — has announced six new university centres. Having watched the software giant Google expand into hardware research on artificial intelligence (AI), the main chip manufacturers are moving to reclaim the territory. As they do so, they are eyeing the start of a significant transformation — arguably the first major shift in architectures since the birth of computing.

This would be important to science: research in fields from astronomy and particle physics to neuroscience, genomics and drug discovery would like to use AI to analyse and find trends in huge sets of data. But this places new demands on traditional computer hardware. The conventional von Neumann architecture keeps data-storage units inside computers separate from data-processing units. Shuttling information back and forth between them takes time and power, and creates a bottleneck in performance.
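The bottleneck described above can be made concrete with a toy accounting exercise (an illustration, not a model of any real machine): in a simple dot product on a von Neumann design, every multiply-add requires two operand fetches from memory, so data movement grows in lockstep with arithmetic and the bus between memory and processor becomes the limiting factor.

```python
# Toy illustration of the von Neumann bottleneck: tally the memory
# traffic (operand loads) alongside the arithmetic in a dot product.
def dot_with_traffic(a, b):
    loads = 0    # operand fetches crossing the memory-processor bus
    flops = 0    # arithmetic operations on the processor side
    total = 0.0
    for x, y in zip(a, b):
        loads += 2       # fetch x and y from the memory unit
        flops += 2       # one multiply, one add
        total += x * y
    return total, loads, flops

result, loads, flops = dot_with_traffic([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])
print(result, loads, flops)  # data movement matches arithmetic one-for-one
```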

To take advantage of AI technology, hardware engineers are looking to build computers that go beyond the constraints of von Neumann design. This would be a big step forward. For decades, advances in computing have been driven by scaling down the size of the components, guided by Gordon Moore’s prediction that the number of transistors on a chip roughly doubles every two years — which generally meant that processing power did the same.

Modern computers bear little resemblance to early machines that used punch cards to store information and mechanical relays to perform calculations. Integrated circuits now contain transistors so small that more than 100 million of them would fit on the head of a pin. Yet the fundamental design of separate memory and processing remains, and that places a limit on what can be achieved.

One solution could be to merge the memory and processing units, but performing computational tasks within a memory unit is a major technical challenge.

Google’s AlphaGo research suggests a different way forward. The company has produced new hardware called a tensor processing unit, with an architecture that enables many more operations to be performed simultaneously. This approach to parallel processing significantly increases the speed and energy efficiency of computationally intensive calculations. And designs that relax the strict need to perform exact and error-free computation — a change in strategy known as approximate computing — could increase these benefits further.
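The idea behind approximate computing can be sketched in a few lines (an illustration of the general principle, not Google's implementation): store values at reduced precision, here 8-bit integers rather than full floating point, and accept a small, bounded error in exchange for less memory traffic and cheaper arithmetic.

```python
# Sketch of approximate computing via quantization: represent a
# float in [-1, 1] as an 8-bit integer and accept a small error.
def quantize(x, scale=127.0):
    """Map a float in [-1, 1] to a signed 8-bit integer."""
    return max(-128, min(127, round(x * scale)))

def dequantize(q, scale=127.0):
    """Recover an approximate float from the 8-bit representation."""
    return q / scale

exact = 0.7071
approx = dequantize(quantize(exact))
print(f"exact={exact}, approx={approx:.4f}, error={abs(exact - approx):.4f}")
```

The recovered value differs from the original by well under one per cent, an error many AI workloads tolerate easily, while each stored value shrinks from 32 or 64 bits to 8.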

As a result, the power consumption of AI programs such as AlphaGo has fallen dramatically. But further gains in energy efficiency will be essential if AI is to become widely accessible.

The human brain is the most energy-efficient processor around, so it is natural for hardware developers to try to mimic it. An approach called neuromorphic computing aims to do just that, with technologies that seek to simulate communication and processing in a biological nervous system. Several neuromorphic systems have already demonstrated the ability to emulate collections of neurons on tasks such as pattern recognition.
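A common abstraction in neuromorphic research is the leaky integrate-and-fire neuron. A minimal sketch (the parameter values here are illustrative, not drawn from any particular chip): the membrane potential leaks over time, integrates incoming current, and emits a spike once it crosses a threshold.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# decays each step, accumulates input current, and fires (then
# resets) once it crosses the threshold.
def lif_run(inputs, leak=0.9, threshold=1.0):
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current   # leak the potential, then integrate
        if v >= threshold:
            spikes.append(1)     # fire a spike...
            v = 0.0              # ...and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

# Weak sustained input takes several steps to trigger a spike;
# strong input fires sooner.
print(lif_run([0.4, 0.4, 0.4, 0.4, 0.0, 0.9, 0.9]))
```

Networks of such units communicate through sparse, event-driven spikes rather than a continuous clock, which is a large part of the brain's energy advantage.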

These are baby steps, and now the SRC has stepped in to try to encourage the hardware to walk. Under its Joint University Microelectronics Program, the SRC has quietly placed its focus on developing hardware architecture. A new centre at Purdue University in West Lafayette, Indiana, for example, will research neuromorphic computing, and one at the University of Virginia in Charlottesville will develop ways of harnessing computer memory for extra processing power.

This technological task is huge. So it is heartening to see the SRC, traditionally US-centric, opening its doors. South Korean firm Samsung joined in late 2017, the fifth foreign company to sign up in the past two years. This is a welcome sign of collaboration. But that commercial rivals would work together in this way also signals how technically difficult the industry thinks it will be to develop new hardware systems.

As this research develops, Nature looks forward to covering progress and publishing results. We welcome papers that will enable computing architectures beyond von Neumann, such as components for neuromorphic chips and in-memory processing. Scientists across many fields are waiting for the result: computers powerful enough to sift all of their new-found data. They will have to wait a while yet. But the wait should be worth it.

Nature 554, 145-146 (2018)

doi: 10.1038/d41586-018-01683-1