
Comment

A new frontier for Hopfield networks

Over the past few years there has been a resurgence of interest in Hopfield networks of associative memory. Dmitry Krotov discusses recent theoretical advances and their broader impact in the context of energy-based neural architectures.
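
As background for that discussion, here is a brief sketch of the two energy functions involved (standard notation from the associative-memory literature; the paywalled text itself is not reproduced here). The traditional Hopfield network assigns each binary state $\mathbf{s} \in \{-1, 1\}^N$ a quadratic energy built from Hebbian weights, whereas modern Hopfield networks (dense associative memories) apply a rapidly growing separation function $F$ to the overlaps with the $K$ stored patterns $\boldsymbol{\xi}^{\mu}$:

$$E_{\text{traditional}}(\mathbf{s}) = -\frac{1}{2}\sum_{i \neq j} W_{ij}\, s_i s_j, \qquad W_{ij} = \frac{1}{N}\sum_{\mu=1}^{K} \xi_i^{\mu} \xi_j^{\mu},$$

$$E_{\text{modern}}(\mathbf{s}) = -\sum_{\mu=1}^{K} F\!\left(\boldsymbol{\xi}^{\mu} \cdot \mathbf{s}\right), \qquad F(x) = x^{n} \ \text{or} \ e^{x}.$$

The choice of $F$ is what changes the storage capacity: roughly $0.14N$ patterns for the traditional quadratic energy, on the order of $N^{n-1}$ for $F(x) = x^{n}$, and exponential in $N$ for $F(x) = e^{x}$.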


Fig. 1: The comparative energy landscapes of the traditional and modern Hopfield networks.
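
To make the comparison in Fig. 1 concrete, the following is a minimal numerical sketch (my own illustration, not code from the article; the network size, pattern count, and the choice $F(x) = x^4$ are assumptions) that evaluates both energies at a stored pattern:

```python
# Minimal sketch: traditional vs. modern Hopfield energies.
# Assumptions (not from the article): N = 100 neurons, K = 20 random
# binary patterns, Hebbian weights, separation function F(x) = x^4.
import numpy as np

rng = np.random.default_rng(0)
N, K = 100, 20
xi = rng.choice([-1, 1], size=(K, N))    # stored patterns, one per row

def energy_traditional(s):
    # E = -(1/2) s^T W s with Hebbian weights W = (1/N) sum_mu xi^mu (xi^mu)^T
    # and zero diagonal (the classical 1982 model).
    W = (xi.T @ xi) / N
    np.fill_diagonal(W, 0.0)
    return -0.5 * s @ W @ s

def energy_modern(s, n=4):
    # Dense associative memory: E = -sum_mu F(xi^mu . s) with F(x) = x^n.
    overlaps = (xi @ s).astype(float)
    return -np.sum(overlaps ** n)

s = xi[0]                                # evaluate at a stored pattern
print(f"traditional:  {energy_traditional(s):.1f}")
print(f"modern (n=4): {energy_modern(s):.3g}")
```

At a stored pattern the overlap with that pattern equals $N$, while overlaps with the other patterns are only of order $\sqrt{N}$; raising the overlaps to a high power therefore separates the stored-pattern minima from generic states far more strongly than the quadratic energy does, which is the sharper-basin picture the figure conveys.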


Acknowledgements

I thank B. Hoover and J. Hopfield for numerous discussions over several years about the ideas described in this Comment.

Author information


Correspondence to Dmitry Krotov.

Ethics declarations

Competing interests

The author declares no competing interests.


Cite this article

Krotov, D. A new frontier for Hopfield networks. Nat Rev Phys 5, 366–367 (2023). https://doi.org/10.1038/s42254-023-00595-y
