2018 IEEE Int. Electron Devices Meet. (in the press)

Brain-like computation using artificial neural networks (ANNs) could provide improved energy efficiency compared to conventional computing techniques, which need to shuffle data between memory and processing units. Although the accuracy and performance of software-based ANNs have continued to improve, their power efficiency remains limited because they still run on conventional hardware. Physical hardware whose architecture mimics that of the brain will be needed if brain-like efficiencies are to be achieved, and underlying this is the need for electronic devices that can efficiently and accurately mimic the behaviours of neurons and synapses. Asif Khan and colleagues have now developed a spiking neuron structure that uses a ferroelectric field-effect transistor (FEFET) and a conventional field-effect transistor.

The researchers — who are based at the Georgia Institute of Technology, the University of Notre Dame, the University of California, Berkeley and Lawrence Berkeley National Laboratory — use a lead zirconate titanate-based ferroelectric gate in their FEFET, whose tunable hysteretic properties allow their spiking neuron to exhibit both excitatory and inhibitory functions. The experimental results were used to simulate a spiking neural network comprising several hundred input, excitatory and inhibitory neurons for unsupervised learning tasks. Based on a projected 45 nm node implementation, the network consumed 3.6 nW of power during image classification, lower than that of conventional approaches and other emerging device technologies.
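
As a rough illustration of the kind of dynamics such a device implements, the sketch below shows a simple software leaky integrate-and-fire neuron driven by excitatory and inhibitory spike trains. It is not the authors' FEFET circuit model; the model form and the parameters (w_exc, w_inh, v_thresh, tau) are arbitrary choices made only to show how excitatory input pushes the membrane potential towards firing while inhibitory input suppresses it.

# Illustrative software analogue of a spiking neuron with excitatory and
# inhibitory inputs (a simple leaky integrate-and-fire model). This is NOT
# the authors' FEFET circuit; weights, threshold and time constant are
# arbitrary values chosen only to demonstrate the behaviour.

import numpy as np

def lif_neuron(exc_spikes, inh_spikes, w_exc=0.6, w_inh=0.4,
               v_thresh=1.0, v_reset=0.0, tau=20.0, dt=1.0):
    """Integrate input spike trains over time; return the output spike train."""
    n_steps = len(exc_spikes)
    v = v_reset
    out_spikes = np.zeros(n_steps, dtype=int)
    for t in range(n_steps):
        # Leaky integration: excitatory input raises the membrane potential,
        # inhibitory input lowers it, and the leak pulls it back towards reset.
        dv = (-(v - v_reset) / tau) + w_exc * exc_spikes[t] - w_inh * inh_spikes[t]
        v += dv * dt
        if v >= v_thresh:          # threshold crossed: emit an output spike
            out_spikes[t] = 1
            v = v_reset            # reset the potential after firing
    return out_spikes

# Example: random excitatory and inhibitory input spike trains.
rng = np.random.default_rng(0)
exc = rng.random(200) < 0.30   # 30% excitatory input spike probability
inh = rng.random(200) < 0.10   # 10% inhibitory input spike probability
print("output spikes:", int(lif_neuron(exc, inh).sum()))

In the reported work, several hundred such spiking units are organized into input, excitatory and inhibitory populations and trained in an unsupervised fashion; the sketch above covers only the single-neuron behaviour.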