Letter

Neural network computation with DNA strand displacement cascades

Nature 475, 368–372 (21 July 2011)

Abstract

The impressive capabilities of the mammalian brain—ranging from perception, pattern recognition and memory formation to decision making and motor activity control—have inspired their re-creation in a wide range of artificial intelligence systems for applications such as face recognition, anomaly detection, medical diagnosis and robotic vehicle control1. Yet before neuron-based brains evolved, complex biomolecular circuits provided individual cells with the ‘intelligent’ behaviour required for survival2. However, the study of how molecules can ‘think’ has not produced an equal variety of computational models and applications of artificial chemical systems. Although biomolecular systems have been hypothesized to carry out neural-network-like computations in vivo3,2,4 and the synthesis of artificial chemical analogues has been proposed theoretically5,6,7,8,9, experimental work10,11,12,13 has so far fallen short of fully implementing even a single neuron. Here, building on the richness of DNA computing14 and strand displacement circuitry15, we show how molecular systems can exhibit autonomous brain-like behaviours. Using a simple DNA gate architecture16 that allows experimental scale-up of multilayer digital circuits17, we systematically transform arbitrary linear threshold circuits18 (an artificial neural network model) into DNA strand displacement cascades that function as small neural networks. Our approach even allows us to implement a Hopfield associative memory19 with four fully connected artificial neurons that, after training in silico, remembers four single-stranded DNA patterns and recalls the most similar one when presented with an incomplete pattern. Our results suggest that DNA strand displacement cascades could be used to endow autonomous chemical systems with the capability of recognizing patterns of molecular events, making decisions and responding to the environment.
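To make the computational model concrete, the sketch below implements a small Hopfield associative memory in pure Python: binary threshold neurons with Hebbian weights, the same linear-threshold formalism the paper compiles into DNA strand displacement cascades. The patterns and network size are illustrative choices, not the paper's actual four-neuron, in-silico-trained design.

```python
# Minimal Hopfield associative memory sketch: binary threshold neurons
# with Hebbian weights. Patterns and network size are illustrative, not
# the paper's actual four-neuron DNA design.

patterns = [[1, -1, 1, -1],
            [-1, 1, 1, -1]]
n = len(patterns[0])

# Hebbian training: W[i][j] = sum over stored patterns of p[i]*p[j],
# with a zero diagonal (no self-connections).
W = [[0.0] * n for _ in range(n)]
for p in patterns:
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i][j] += p[i] * p[j]

def recall(probe, sweeps=10):
    """Asynchronous threshold updates; a 0 in the probe marks an unknown bit."""
    s = list(probe)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(W[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

print(recall([1, -1, 0, -1]))  # completes to the first stored pattern
```

Presented with an incomplete pattern, the network settles to the most similar stored pattern, which is the recall behaviour the DNA implementation reproduces chemically.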


References

  1. Neural Networks: A Systematic Introduction (Springer, 1996)

  2. Protein molecules as computational elements in living cells. Nature 376, 307–312 (1995)

  3. A connectionist model of development. J. Theor. Biol. 152, 429–453 (1991)

  4. On schemes of combinatorial transcription logic. Proc. Natl Acad. Sci. USA 100, 5136–5141 (2003)

  5. Chemical implementation of neural networks and Turing machines. Proc. Natl Acad. Sci. USA 88, 10983–10987 (1991)

  6. Building an associative memory vastly larger than the brain. Science 268, 583–585 (1995)

  7. Article for analog vector algebra computation. Biosystems 52, 175–180 (1999)

  8. In Advances in Neural Information Processing Systems Vol. 17, 681–688 (MIT Press, 2004)

  9. In DNA Computing and Molecular Programming 176–186 (Lecture Notes in Computer Science Vol. 6518, Springer, 2011)

  10. Experiments on pattern recognition by chemical kinetics. J. Phys. Chem. 99, 10063–10065 (1995)

  11. Experimental aspects of DNA neural network computation. Soft Comput. 5, 10–18 (2001)

  12. In vitro molecular pattern classification via DNA-based weighted-sum operation. Biosystems 100, 1–7 (2010)

  13. Synthetic in vitro transcriptional oscillators. Mol. Syst. Biol. 7, 465 (2011)

  14. Molecular computation of solutions to combinatorial problems. Science 266, 1021–1024 (1994)

  15. Dynamic DNA nanotechnology using strand-displacement reactions. Nature Chem. 3, 103–113 (2011)

  16. A simple DNA gate motif for synthesizing large-scale circuits. J. R. Soc. Interface, doi:10.1098/rsif.2010.0729 (published online 4 February 2011)

  17. Scaling up digital circuit computation with DNA strand displacement cascades. Science 332, 1196–1201 (2011)

  18. Threshold Logic and its Applications Vol. 18 (Wiley-Interscience, 1971)

  19. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl Acad. Sci. USA 79, 2554–2558 (1982)

  20. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biol. 5, 115–133 (1943)

  21. The realization of symmetric switching functions with linear-input logical elements. IRE Trans. Electron. Comput. EC-10, 371–378 (1961)

  22. The complexity of the parity function in unbounded fan-in, unbounded depth circuits. Theor. Comput. Sci. 85, 155–170 (1991)

  23. Threshold circuits of bounded depth. J. Comput. Syst. Sci. 46, 129–154 (1993)

  24. In Symposium on the Application of Switching Theory 289–297 (Stanford Univ. Press, 1963)

  25. Deoxyribozyme-based three-input logic gates and construction of a molecular full adder. Biochemistry 45, 1194–1199 (2006)

  26. Characteristics of sparsely encoded associative memory. Neural Netw. 2, 451–457 (1989)

  27. DNA as a universal substrate for chemical kinetics. Proc. Natl Acad. Sci. USA 107, 5393–5398 (2010)

  28. Molecular circuits for associative learning in single-celled organisms. J. R. Soc. Interface 6, 463–469 (2009)

  29. Training a molecular automaton to play a game. Nature Nanotechnol. 5, 773–777 (2010)

  30. MicroRNAs accurately identify cancer tissue origin. Nature Biotechnol. 26, 462–469 (2008)

  31. Towards biomedical applications for nucleic acid nanodevices. Nanomedicine 2, 817–830 (2007)


Acknowledgements

We thank P. Rothemund, P. Yin, D. Woods, D. Soloveichik and N. Dabby for comments on the manuscript. We also thank R. Murray for the use of experimental facilities. This work was supported by the NSF (grant nos 0728703 and 0832824 (The Molecular Programming Project)) and by HFSP award no. RGY0074/2006-C.

Author information

Affiliations

  1. Bioengineering, California Institute of Technology, Pasadena, California 91125, USA

     Lulu Qian & Erik Winfree

  2. Computer Science, California Institute of Technology, Pasadena, California 91125, USA

     Erik Winfree

  3. Computation and Neural Systems, California Institute of Technology, Pasadena, California 91125, USA

     Erik Winfree & Jehoshua Bruck

  4. Electrical Engineering, California Institute of Technology, Pasadena, California 91125, USA

     Jehoshua Bruck


Contributions

L.Q. designed the system, performed the experiments and analysed the data; L.Q. and E.W. performed the in silico training and wrote the manuscript; E.W. guided the project and discussed the design and the data; and J.B. initiated and guided the project, and discussed the manuscript.

Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Erik Winfree.

Supplementary information

PDF files

  1. Supplementary Information

     This file contains Supplementary Figures 1–28 with legends, Supplementary Methods, and Supplementary Text and Data comprising: 1, the seesaw DNA gate motif; 2, four types of seesaw gates; 3, four transformation rules; 4, circuit design lessons learned from experiments; 5, in silico training of dual-rail monotone Hopfield associative memories; 6, a four-neuron dual-rail monotone Hopfield associative memory; 7, modelling and simulations; and 8, sequences, Supplementary Tables 1–7 and additional references.

About this article

DOI

https://doi.org/10.1038/nature10262
