Memristors and other emerging memory technologies can be used to create energy-efficient implementations of neural networks. However, for certain edge applications (where only limited amounts of data are available and explainable decisions are required), neural networks may not provide an acceptable form of intelligence. Bayesian reasoning could resolve these concerns, but it is computationally expensive and, unlike neural networks, does not translate naturally to memristor-based architectures. Here we report a memristor-based Bayesian machine. The architecture of the machine is obtained by writing Bayes’ law in a form that maps naturally onto the principles of distributed memory and stochastic computing, allowing the circuit to operate using only local memory and minimal data movement. We fabricate a prototype circuit that incorporates 2,048 memristors and 30,080 transistors using a hybrid complementary metal–oxide–semiconductor/memristor process. We show that a scaled-up design of the machine is more energy efficient in a practical gesture-recognition task than a standard implementation of Bayesian inference on a microcontroller unit. Our Bayesian machine also offers instant on/off operation and is resilient to single-event upsets.
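The core idea named in the abstract, evaluating the products of probabilities in Bayes’ law with stochastic computing, can be illustrated in software: each probability is encoded as a Bernoulli bitstream, and an AND gate across independent streams yields a stream whose mean approximates the product of the encoded probabilities. The sketch below is a generic software illustration of that principle, not the paper’s circuit; the function names, stream length and seed are our own illustrative choices.

```python
import random


def bitstream(p, n, rng):
    """Bernoulli bitstream of length n encoding probability p."""
    return [rng.random() < p for _ in range(n)]


def stochastic_product(probs, n=100_000, seed=0):
    """Estimate the product of probabilities by ANDing independent
    Bernoulli bitstreams, as in stochastic computing."""
    rng = random.Random(seed)
    streams = [bitstream(p, n, rng) for p in probs]
    # An AND gate across the streams fires only where every stream is 1;
    # the firing rate estimates the product of the stream probabilities.
    ones = sum(all(bits) for bits in zip(*streams))
    return ones / n


# Unnormalized posterior for one class: prior times product of likelihoods.
prior, likelihoods = 0.5, [0.8, 0.6]
estimate = stochastic_product([prior] + likelihoods)  # close to 0.5 * 0.8 * 0.6 = 0.24
```

Because each AND gate only needs its local bitstreams, a hardware realization of this scheme can keep every probability in the memory adjacent to the gate that consumes it, which is what enables the minimal data movement described above.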
The analysed datasets and all data measured in this study are available from the corresponding author upon reasonable request.
The software programs used for modelling the Bayesian machine are available from the corresponding author upon reasonable request.
This work was supported by the European Research Council starting grant NANOINFER (reference: 715872). We would like to thank A. Cherkaoui, M. Faix, R. Frisch, J. Grollier, L. Herrera-Diez, E. Mazer, A. Renaudineau, J. Simatic and S. Tiwari for discussion and invaluable feedback.
The authors declare no competing interests.
Peer review information
Nature Electronics thanks Hussam Amrouch, Justin Correll and Rajkumar Kubendran for their contribution to the peer review of this work.
Extended Data Fig. 1 Detailed operation of the Bayesian machine.
a, Schematic illustrating the detailed architecture of a likelihood elementary block. b, Flowchart of the operations performed during a Bayesian inference computation in the Bayesian machine. c, Timing diagram illustrating the operation of the Bayesian machine. This figure is described in detail in Supplementary Note 4.
Harabi, KE., Hirtzlin, T., Turck, C. et al. A memristor-based Bayesian machine. Nat Electron 6, 52–63 (2023). https://doi.org/10.1038/s41928-022-00886-9
This article is cited by
A tiny ‘Bayesian machine’ does much with little