
Learning function from structure in neuromorphic networks

A preprint version of the article is available at bioRxiv.

Abstract

The connection patterns of neural circuits in the brain form a complex network. Collective signalling within the network manifests as patterned neural activity and is thought to support human cognition and adaptive behaviour. Recent technological advances permit macroscale reconstructions of biological brain networks. These maps, termed connectomes, display multiple non-random architectural features, including heavy-tailed degree distributions, segregated communities and a densely interconnected core. Yet, how computation and functional specialization emerge from network architecture remains unknown. Here we reconstruct human brain connectomes using in vivo diffusion-weighted imaging and use reservoir computing to implement connectomes as artificial neural networks. We then train these neuromorphic networks to learn a memory-encoding task. We show that biologically realistic neural architectures perform best when they display critical dynamics. We find that performance is driven by network topology and that the modular organization of intrinsic networks is computationally relevant. We observe a prominent interaction between network structure and dynamics throughout, such that the same underlying architecture can support a wide range of memory capacity values as well as different functions (encoding or decoding), depending on the dynamical regime the network is in. This work opens new opportunities to discover how the network organization of the brain optimizes cognitive capacity.
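The reservoir-computing approach summarized above can be illustrated with a minimal echo-state-network sketch. This is not the authors' archived pipeline; it substitutes a random symmetric matrix for the connectome, scales its spectral radius to set the dynamical regime (values near 1 place the linear dynamics near criticality), and estimates memory capacity as the summed squared correlation between time-lagged inputs and their best linear readouts. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a connectome: a random symmetric, zero-diagonal
# weight matrix. In the study the reservoir is a diffusion-derived
# structural connectome; here we only illustrate the pipeline.
n = 100
w = rng.random((n, n))
w = (w + w.T) / 2
np.fill_diagonal(w, 0)

# Scale the matrix so its spectral radius equals alpha; alpha close to 1
# places the network near the edge of stability (the "critical" regime).
alpha = 0.9
w = alpha * w / np.max(np.abs(np.linalg.eigvals(w)))

# Drive the reservoir with a scalar random input signal.
t_len, washout = 2000, 200
u = rng.uniform(-1, 1, t_len)
w_in = rng.uniform(-1, 1, n)
x = np.zeros((t_len, n))
for t in range(1, t_len):
    x[t] = np.tanh(w @ x[t - 1] + w_in * u[t])


def memory_capacity(x, u, max_lag=20, washout=200):
    """Sum over lags k of the squared correlation between the input
    delayed by k steps and its best linear readout from the states."""
    mc = 0.0
    states = x[washout:, :]
    for k in range(1, max_lag + 1):
        target = u[washout - k:len(u) - k]
        coef, *_ = np.linalg.lstsq(states, target, rcond=None)
        pred = states @ coef
        r = np.corrcoef(pred, target)[0, 1]
        mc += r ** 2
    return mc


print(round(memory_capacity(x, u), 2))
```

Re-running the sweep with different values of `alpha` is one way to see the abstract's central point: the same underlying weight matrix yields very different memory capacities depending on the dynamical regime it is placed in.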


Fig. 1: Measuring the memory capacity of biological neural networks.
Fig. 2: Memory capacity of the human connectome.
Fig. 3: Computation versus wiring cost trade-off.
Fig. 4: Memory capacity of intrinsic networks.
Fig. 5: Encoding and decoding capacity of intrinsic networks.
Fig. 6: Contribution of network topology to memory capacity is a function of the dynamics.

Data availability

The source dataset from the University of Lausanne is available at https://doi.org/10.5281/zenodo.2872624. To facilitate reproducibility, the preprocessed data needed to run the code are publicly available on Zenodo (https://doi.org/10.5281/zenodo.4776453).

Code availability

All code used for data processing, simulation, analysis, and figure generation is publicly available on GitHub and Zenodo (https://github.com/netneurolab/suarez_neuromorphicnetworks; https://doi.org/10.5281/zenodo.4776829)172, and is built on top of the following open-source Python packages: reservoir (https://github.com/estefanysuarez/reservoir; https://doi.org/10.5281/zenodo.4913398), netneurotools (https://github.com/netneurolab/netneurotools), NumPy173,174,175, SciPy176, pandas177, scikit-learn178, bctpy (https://github.com/aestrivex/bctpy)89, NetworkX179, Matplotlib180 and Seaborn181.
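In echo-state-network pipelines such as the one archived above, the dynamical regime is conventionally controlled by a single knob: the spectral radius of the recurrent weight matrix. A minimal NumPy sketch of that normalization follows, with a random matrix standing in for the connectome; the function name and values are assumptions for illustration, not the repository's API.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical connectome stand-in: any nonnegative weight matrix works here.
n = 50
w = rng.random((n, n))


def scale_spectral_radius(w, alpha):
    """Rescale w so its largest absolute eigenvalue equals alpha."""
    radius = np.max(np.abs(np.linalg.eigvals(w)))
    return alpha * w / radius


# Sweep from a stable (<1) through a near-critical (~1) to an unstable (>1)
# linear regime; the topology of w is unchanged, only its gain is rescaled.
for alpha in (0.5, 1.0, 1.5):
    ws = scale_spectral_radius(w, alpha)
    print(alpha, np.max(np.abs(np.linalg.eigvals(ws))))
```

Because eigenvalues scale linearly with the matrix, the rescaled spectral radius matches `alpha` up to floating-point error, so the same architecture can be probed across regimes without altering its wiring.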

References

  1. Insel, T. R., Landis, S. C. & Collins, F. S. The NIH brain initiative. Science 340, 687–688 (2013).

  2. Van den Heuvel, M. P., Bullmore, E. T. & Sporns, O. Comparative connectomics. Trends Cogn. Sci. 20, 345–361 (2016).

  3. Sporns, O., Tononi, G. & Kötter, R. The human connectome: a structural description of the human brain. PLoS Comput. Biol. 1, e42 (2005).

  4. Bassett, D. S. & Sporns, O. Network neuroscience. Nat. Neurosci. 20, 353–364 (2017).

  5. Watts, D. J. & Strogatz, S. H. Collective dynamics of ‘small-world’ networks. Nature 393, 440–442 (1998).

  6. Sporns, O. & Zwi, J. D. The small world of the cerebral cortex. Neuroinformatics 2, 145–162 (2004).

  7. Bassett, D. S. & Bullmore, E. Small-world brain networks. Neuroscientist 12, 512–523 (2006).

  8. Kaiser, M. & Hilgetag, C. C. Nonoptimal component placement, but short processing paths, due to long-distance projections in neural systems. PLoS Comput. Biol. 2, e95 (2006).

  9. Chen, Z. J., He, Y., Rosa-Neto, P., Germann, J. & Evans, A. C. Revealing modular architecture of human brain structural networks by using cortical thickness from MRI. Cereb. Cortex 18, 2374–2381 (2008).

  10. Betzel, R. F. et al. The modular organization of human anatomical brain networks: accounting for the cost of wiring. Netw. Neurosci. 1, 42–68 (2017).

  11. Bertolero, M. A., Yeo, B. T. & D’Esposito, M. The modular and integrative functional architecture of the human brain. Proc. Natl Acad. Sci. USA 112, E6798–E6807 (2015).

  12. Hilgetag, C. C. & Kaiser, M. Clustered organization of cortical connectivity. Neuroinformatics 2, 353–360 (2004).

  13. Power, J. D. et al. Functional network organization of the human brain. Neuron 72, 665–678 (2011).

  14. Liu, Z.-Q., Zheng, Y.-Q. & Misic, B. Network topology of the marmoset connectome. Netw. Neurosci. 4, 1181–1196 (2020).

  15. Hagmann, P. et al. Mapping the structural core of human cerebral cortex. PLoS Biol. 6, e159 (2008).

  16. Sporns, O., Honey, C. J. & Kötter, R. Identification and classification of hubs in brain networks. PLoS ONE 2, e1049 (2007).

  17. Zamora-López, G., Zhou, C. & Kurths, J. Cortical hubs form a module for multisensory integration on top of the hierarchy of cortical networks. Front. Neuroinform. 4, 1 (2010).

  18. van den Heuvel, M. P., Kahn, R. S., Goñi, J. & Sporns, O. High-cost, high-capacity backbone for global brain communication. Proc. Natl Acad. Sci. USA 109, 11372–11377 (2012).

  19. Towlson, E. K., Vértes, P. E., Ahnert, S. E., Schafer, W. R. & Bullmore, E. T. The rich club of the C. elegans neuronal connectome. J. Neurosci. 33, 6380–6387 (2013).

  20. Bullmore, E. & Sporns, O. The economy of brain network organization. Nat. Rev. Neurosci. 13, 336–349 (2012).

  21. Uddin, L. Q. Bring the noise: reconceptualizing spontaneous neural activity. Trends Cogn. Sci. 24, 734–746 (2020).

  22. Suárez, L. E., Markello, R. D., Betzel, R. F. & Misic, B. Linking structure and function in macroscale brain networks. Trends Cogn. Sci. 24, 302–315 (2020).

  23. Shafiei, G. et al. Topographic gradients of intrinsic dynamics across neocortex. eLife 9, e62116 (2020).

  24. Breakspear, M. Dynamic models of large-scale brain activity. Nat. Neurosci. 20, 340–352 (2017).

  25. Mišić, B. & Sporns, O. From regions to connections and networks: new bridges between brain and behavior. Curr. Opin. Neurobiol. 40, 1–7 (2016).

  26. Seguin, C., Tian, Y. & Zalesky, A. Network communication models improve the behavioral and functional predictive utility of the human structural connectome. Netw. Neurosci. 4, 980–1006 (2020).

  27. Melozzi, F. et al. Individual structural features constrain the functional connectome. Proc. Natl Acad. Sci. USA 116, 26961–26969 (2019).

  28. Ju, H. & Bassett, D. S. Dynamic representations in networked neural systems. Nat. Neurosci. 23, 908–917 (2020).

  29. Richards, B. A. et al. A deep learning framework for neuroscience. Nat. Neurosci. 22, 1761–1770 (2019).

  30. Lukoševičius, M. & Jaeger, H. Reservoir computing approaches to recurrent neural network training. Comput. Sci. Rev. 3, 127–149 (2009).

  31. Jaeger, H. The ‘Echo State’ Approach to Analysing and Training Recurrent Neural Networks—With an Erratum Note GMD Technical Report (German National Research Center for Information Technology, 2001).

  32. Maass, W., Natschläger, T. & Markram, H. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput. 14, 2531–2560 (2002).

  33. Verstraeten, D., Schrauwen, B. & Stroobandt, D. Reservoir-based techniques for speech recognition. In 2006 IEEE International Joint Conference on Neural Network Proceedings 1050–1053 (IEEE, 2006).

  34. Verstraeten, D., Schrauwen, B., Stroobandt, D. & Van Campenhout, J. Isolated word recognition with the liquid state machine: a case study. Inf. Process. Lett. 95, 521–528 (2005).

  35. Skowronski, M. D. & Harris, J. G. Automatic speech recognition using a predictive echo state network classifier. Neural Netw. 20, 414–423 (2007).

  36. Salmen, M. & Ploger, P. G. Echo state networks used for motor control. In Proc. 2005 IEEE International Conference on Robotics and Automation 1953–1958 (IEEE, 2005).

  37. Ijspeert, A. J. Central pattern generators for locomotion control in animals and robots: a review. Neural Netw. 21, 642–653 (2008).

  38. Li, J. & Jaeger, H. Minimal Energy Control of an ESN Pattern Generator ICT-248311 (AMARSi, 2011).

  39. Tong, M. H., Bickett, A. D., Christiansen, E. M. & Cottrell, G. W. Learning grammatical structure with echo state networks. Neural Netw. 20, 424–432 (2007).

  40. Homma, Y. & Hagiwara, M. An echo state network with working memories for probabilistic language modeling. In International Conference on Artificial Neural Networks 595–602 (Springer, 2013).

  41. Pascanu, R. & Jaeger, H. A neurodynamical model for working memory. Neural Netw. 24, 199–207 (2011).

  42. Strock, A., Rougier, N. P. & Hinaut, X. A simple reservoir model of working memory with real values. In 2018 International Joint Conference on Neural Networks (IJCNN) 1–8 (IEEE, 2018).

  43. Sussillo, D. & Abbott, L. F. Generating coherent patterns of activity from chaotic neural networks. Neuron 63, 544–557 (2009).

  44. Antonelo, E. A. & Schrauwen, B. On learning navigation behaviors for small mobile robots with reservoir computing architectures. IEEE Trans. Neural Netw. Learn. Syst. 26, 763–780 (2014).

  45. Jaeger, H. Short Term Memory in Echo State Networks GMD Report 152 (German National Research Center for Information Technology, 2002); http://www.faculty.jacobs-university.de/hjaeger/pubs/STMEchoStatesTechRep.pdf

  46. Damoiseaux, J. et al. Consistent resting-state networks across healthy subjects. Proc. Natl Acad. Sci. USA 103, 13848–13853 (2006).

  47. Smith, S. M. et al. Correspondence of the brain’s functional architecture during activation and rest. Proc. Natl Acad. Sci. USA 106, 13040–13045 (2009).

  48. Bellec, P., Rosa-Neto, P., Lyttelton, O. C., Benali, H. & Evans, A. C. Multi-level bootstrap analysis of stable clusters in resting-state fMRI. NeuroImage 51, 1126–1139 (2010).

  49. Thomas Yeo, B. et al. The organization of the human cerebral cortex estimated by intrinsic functional connectivity. J. Neurophysiol. 106, 1125–1165 (2011).

  50. Schaefer, A. et al. Local-global parcellation of the human cerebral cortex from intrinsic functional connectivity MRI. Cereb. Cortex 28, 3095–3114 (2018).

  51. Uddin, L. Q., Yeo, B. T. & Spreng, R. N. Towards a universal taxonomy of macro-scale functional human brain networks. Brain Topogr. 32, 926–942 (2019).

  52. Bertolero, M. A., Yeo, B. T., Bassett, D. S. & D’Esposito, M. A mechanistic model of connector hubs, modularity and cognition. Nat. Hum. Behav. 2, 765–777 (2018).

  53. Buonomano, D. V. & Maass, W. State-dependent computations: spatiotemporal processing in cortical networks. Nat. Rev. Neurosci. 10, 113 (2009).

  54. Betzel, R. F., Griffa, A., Hagmann, P. & Mišić, B. Distance-dependent consensus thresholds for generating group-representative structural brain networks. Netw. Neurosci. 3, 475–496 (2018).

  55. de Reus, M. A. & van den Heuvel, M. P. Estimating false positives and negatives in brain networks. NeuroImage 70, 402–409 (2013).

  56. Roberts, J. A., Perry, A., Roberts, G., Mitchell, P. B. & Breakspear, M. Consistency-based thresholding of the human connectome. NeuroImage 145, 118–129 (2017).

  57. Elman, J. L. Finding structure in time. Cogn. Sci. 14, 179–211 (1990).

  58. Elman, J. L. Distributed representations, simple recurrent networks, and grammatical structure. Mach. Learn. 7, 195–225 (1991).

  59. Wig, G. S. Segregated systems of human brain networks. Trends Cogn. Sci. 21, 981–996 (2017).

  60. Büsing, L., Schrauwen, B. & Legenstein, R. Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Comput. 22, 1272–1311 (2010).

  61. Mastrogiuseppe, F. & Ostojic, S. Linking connectivity, dynamics, and computations in low-rank recurrent neural networks. Neuron 99, 609–623 (2018).

  62. Seung, H. S. How the brain keeps the eyes still. Proc. Natl Acad. Sci. USA 93, 13339–13344 (1996).

  63. Maslov, S. & Sneppen, K. Specificity and stability in topology of protein networks. Science 296, 910–913 (2002).

  64. Rodriguez, N., Izquierdo, E. & Ahn, Y.-Y. Optimal modularity and memory capacity of neural reservoirs. Netw. Neurosci. 3, 551–566 (2019).

  65. Alexander-Bloch, A. F. et al. On testing for spatial correspondence between maps of human brain structure and function. NeuroImage 178, 540–551 (2018).

  66. Markello, R. & Misic, B. Comparing spatial null models for brain maps. NeuroImage 236, 118052 (2021).

  67. Shine, J. M., Aburn, M. J., Breakspear, M. & Poldrack, R. A. The modulation of neural gain facilitates a transition between functional segregation and integration in the brain. eLife 7, e31130 (2018).

  68. Crossley, N. A. et al. Cognitive relevance of the community structure of the human brain functional coactivation network. Proc. Natl Acad. Sci. USA 110, 11583–11588 (2013).

  69. Stiso, J. & Bassett, D. S. Spatial embedding imposes constraints on neuronal network architectures. Trends Cogn. Sci. 22, 1127–1142 (2018).

  70. Horvát, S. et al. Spatial embedding and wiring cost constrain the functional layout of the cortical network of rodents and primates. PLoS Biol. 14, e1002512 (2016).

  71. Roberts, J. A. et al. The contribution of geometry to the human connectome. NeuroImage 124, 379–393 (2016).

  72. Betzel, R. F. & Bassett, D. S. Specificity and robustness of long-distance connections in weighted, interareal connectomes. Proc. Natl Acad. Sci. USA 115, E4880–E4889 (2018).

  73. Hopfield, J. J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl Acad. Sci. USA 79, 2554–2558 (1982).

  74. Zylberberg, J. & Strowbridge, B. W. Mechanisms of persistent activity in cortical circuits: possible neural substrates for working memory. Ann. Rev. Neurosci. 40, 603–627 (2017).

  75. Scoville, W. B. & Milner, B. Loss of recent memory after bilateral hippocampal lesions. J. Neurol. Neurosurg. Psychiatry 20, 11–21 (1957).

  76. Wills, T. J., Lever, C., Cacucci, F., Burgess, N. & O’Keefe, J. Attractor dynamics in the hippocampal representation of the local environment. Science 308, 873–876 (2005).

  77. Neves, G., Cooke, S. F. & Bliss, T. V. Synaptic plasticity, memory and the hippocampus: a neural network approach to causality. Nat. Rev. Neurosci. 9, 65–75 (2008).

  78. Margulies, D. S. et al. Situating the default-mode network along a principal gradient of macroscale cortical organization. Proc. Natl Acad. Sci. USA 113, 12574–12579 (2016).

  79. Huntenburg, J. M., Bazin, P.-L. & Margulies, D. S. Large-scale gradients in human cortical organization. Trends Cogn. Sci. 22, 21–31 (2018).

  80. Mesulam, M. Neurocognitive networks and selectively distributed processing. Rev. Neurol. 150, 564–569 (1994).

  81. Mesulam, M.-M. From sensation to cognition. Brain 121, 1013–1052 (1998).

  82. Taylor, P., Hobbs, J., Burroni, J. & Siegelmann, H. The global landscape of cognition: hierarchical aggregation as an organizational principle of human cortical networks and functions. Sci. Rep. 5, 1–18 (2015).

  83. Shine, J. M., Li, M., Koyejo, O., Fulcher, B. & Lizier, J. T. Topological augmentation of latent information streams in feed-forward neural networks. Preprint at https://www.biorxiv.org/content/10.1101/2020.09.30.321679v1 (2020).

  84. Sporns, O. Network attributes for segregation and integration in the human brain. Curr. Opin. Neurobiol. 23, 162–171 (2013).

  85. Sporns, O., Chialvo, D. R., Kaiser, M. & Hilgetag, C. C. Organization, development and function of complex brain networks. Trends Cogn. Sci. 8, 418–425 (2004).

  86. Dohmatob, E., Dumas, G. & Bzdok, D. Dark control: the default mode network as a reinforcement learning agent. Hum. Brain Mapp. 41, 3318–3341 (2020).

  87. Sherman, S. M. & Guillery, R. The role of the thalamus in the flow of information to the cortex. Philos. Trans. R. Soc. B 357, 1695–1708 (2002).

  88. Bassett, D. S., Brown, J. A., Deshpande, V., Carlson, J. M. & Grafton, S. T. Conserved and variable architecture of human white matter connectivity. NeuroImage 54, 1262–1279 (2011).

  89. Rubinov, M. & Sporns, O. Complex network measures of brain connectivity: uses and interpretations. NeuroImage 52, 1059–1069 (2010).

  90. Damicelli, F., Hilgetag, C. C. & Goulas, A. Brain connectivity meets reservoir computing. Preprint at https://www.biorxiv.org/content/10.1101/2021.01.22.427750v1 (2021).

  91. Goulas, A., Damicelli, F. & Hilgetag, C. C. Bio-instantiated recurrent neural networks. Preprint at https://www.biorxiv.org/content/10.1101/2021.01.22.427744v2 (2021).

  92. Haeusler, S. & Maass, W. A statistical analysis of information-processing properties of lamina-specific cortical microcircuit models. Cereb. Cortex 17, 149–162 (2007).

  93. Shew, W. L., Yang, H., Petermann, T., Roy, R. & Plenz, D. Neuronal avalanches imply maximum dynamic range in cortical networks at criticality. J. Neurosci. 29, 15595–15600 (2009).

  94. Haldeman, C. & Beggs, J. M. Critical branching captures activity in living neural networks and maximizes the number of metastable states. Phys. Rev. Lett. 94, 058101 (2005).

  95. Vázquez-Rodríguez, B. et al. Stochastic resonance at criticality in a network model of the human cortex. Sci. Rep. 7, 1–12 (2017).

  96. Tagliazucchi, E., Balenzuela, P., Fraiman, D. & Chialvo, D. R. Criticality in large-scale brain fMRI dynamics unveiled by a novel point process analysis. Front. Physiol. 3, 15 (2012).

  97. Kitzbichler, M. G., Smith, M. L., Christensen, S. R. & Bullmore, E. Broadband criticality of human brain network synchronization. PLoS Comput. Biol. 5, e1000314 (2009).

  98. Cocchi, L., Gollo, L. L., Zalesky, A. & Breakspear, M. Criticality in the brain: a synthesis of neurobiology, models and cognition. Prog. Neurobiol. 158, 132–152 (2017).

  99. Deco, G. & Jirsa, V. K. Ongoing cortical activity at rest: criticality, multistability, and ghost attractors. J. Neurosci. 32, 3366–3375 (2012).

  100. Langton, C. Computation at the edge of chaos: phase transition and emergent computation. Physica D 42, 12–37 (1990).

  101. Bertschinger, N. & Natschläger, T. Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput. 16, 1413–1436 (2004).

  102. Legenstein, R. & Maass, W. in New Directions in Statistical Signal Processing: From Systems to Brains 127–154 (IEEE, 2007).

  103. Legenstein, R. & Maass, W. Edge of chaos and prediction of computational performance for neural circuit models. Neural Netw. 20, 323–334 (2007).

  104. Maass, W. & Markram, H. On the computational power of circuits of spiking neurons. J. Comput. Syst. Sci. 69, 593–616 (2004).

  105. Betzel, R. F. et al. Generative models of the human connectome. NeuroImage 124, 1054–1064 (2016).

  106. Bassett, D. S. et al. Efficient physical embedding of topologically complex information processing networks in brains and computer circuits. PLoS Comput. Biol. 6, e1000748 (2010).

  107. Tanaka, G. et al. Recent advances in physical reservoir computing: a review. Neural Netw. 115, 100–123 (2019).

  108. Appeltant, L. et al. Information processing using a single dynamical node as complex system. Nat. Commun. 2, 1–6 (2011).

  109. Soriano, M. C. et al. Delay-based reservoir computing: noise effects in a combined analog and digital implementation. IEEE Trans. Neural Netw. Learn. Syst. 26, 388–393 (2014).

  110. Li, J., Bai, K., Liu, L. & Yi, Y. A deep learning based approach for analog hardware implementation of delayed feedback reservoir computing system. In 2018 19th International Symposium on Quality Electronic Design (ISQED) 308–313 (IEEE, 2018).

  111. Zhao, C. et al. Novel spike based reservoir node design with high performance spike delay loop. In Proc. 3rd ACM International Conference on Nanoscale Computing and Communication 1–5 (ACM, 2016).

  112. Antonik, P. Application of FPGA to Real-Time Machine Learning: Hardware Reservoir Computers and Software Image Processing (Springer, 2018).

  113. Alomar, M. L., Canals, V., Martínez-Moll, V. & Rosselló, J. L. Low-cost hardware implementation of reservoir computers. In 2014 24th International Workshop on Power and Timing Modeling, Optimization and Simulation (PATMOS) 1–5 (IEEE, 2014).

  114. Antonik, P., Smerieri, A., Duport, F., Haelterman, M. & Massar, S. FPGA implementation of reservoir computing with online learning. In 24th Belgian-Dutch Conference on Machine Learning (Springer, 2015).

  115. Wang, Q., Li, Y., Shao, B., Dey, S. & Li, P. Energy efficient parallel neuromorphic architectures with approximate arithmetic on FPGA. Neurocomputing 221, 146–158 (2017).

  116. Petre, P. & Cruz-Albrecht, J. Neuromorphic mixed-signal circuitry for asynchronous pulse processing. In 2016 IEEE International Conference on Rebooting Computing (ICRC) 1–4 (IEEE, 2016).

  117. Polepalli, A., Soures, N. & Kudithipudi, D. Digital neuromorphic design of a liquid state machine for real-time processing. In 2016 IEEE International Conference on Rebooting Computing (ICRC) 1–8 (IEEE, 2016).

  118. Roy, S., Banerjee, A. & Basu, A. Liquid state machine with dendritically enhanced readout for low-power, neuromorphic VLSI implementations. IEEE Trans. Biomed. Circuits Syst. 8, 681–695 (2014).

  119. Yang, X., Chen, W. & Wang, F. Z. Investigations of the staircase memristor model and applications of memristor-based local connections. Analog Integr. Circuits Signal Process. 87, 263–273 (2016).

  120. Bennett, C. H., Querlioz, D. & Klein, J.-O. Spatio-temporal learning with arrays of analog nanosynapses. In 2017 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH) 125–130 (IEEE, 2017).

  121. Kulkarni, M. S. & Teuscher, C. Memristor-based reservoir computing. In 2012 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH) 226–232 (IEEE, 2012).

  122. Du, C. et al. Reservoir computing using dynamic memristors for temporal information processing. Nat. Commun. 8, 1–10 (2017).

  123. Sillin, H. O. et al. A theoretical and experimental study of neuromorphic atomic switch networks for reservoir computing. Nanotechnology 24, 384004 (2013).

  124. Kendall, J. D., Nino, J. C. & Suárez, L. E. Deep learning in bipartite memristive networks. US patent 15/985,212 (2018).

  125. Suárez, L. E., Kendall, J. D. & Nino, J. C. Evaluation of the computational capabilities of a memristive random network (mn3) under the context of reservoir computing. Neural Netw. 106, 223–236 (2018).

  126. Vandoorne, K. et al. Toward optical signal processing using photonic reservoir computing. Opt. Exp. 16, 11182–11192 (2008).

  127. Vandoorne, K. et al. Experimental demonstration of reservoir computing on a silicon photonics chip. Nat. Commun. 5, 1–6 (2014).

  128. Zhang, H. et al. Integrated photonic reservoir computing based on hierarchical time-multiplexing structure. Opt. Exp. 22, 31356–31370 (2014).

  129. Katumba, A., Freiberger, M., Bienstman, P. & Dambre, J. A multiple-input strategy to efficient integrated photonic reservoir computing. Cogn. Comput. 9, 307–314 (2017).

  130. Katumba, A. et al. Low-loss photonic reservoir computing with multimode photonic integrated circuits. Sci. Rep. 8, 1–10 (2018).

  131. Laporte, F., Katumba, A., Dambre, J. & Bienstman, P. Numerical demonstration of neuromorphic computing with photonic crystal cavities. Opt. Exp. 26, 7955–7964 (2018).

  132. Mišić, B. et al. Network-level structure–function relationships in human neocortex. Cereb. Cortex 26, 3285–3296 (2016).

  133. Messé, A., Rudrauf, D., Benali, H. & Marrelec, G. Relating structure and function in the human brain: relative contributions of anatomy, stationary dynamics, and non-stationarities. PLoS Comput. Biol. 10, e1003530 (2014).

  134. Graham, D. & Rockmore, D. The packet switching brain. J. Cogn. Neurosci. 23, 267–276 (2011).

  135. Goñi, J. et al. Resting-brain functional connectivity predicted by analytic measures of network communication. Proc. Natl Acad. Sci. USA 111, 833–838 (2014).

  136. Mišić, B. et al. Cooperative and competitive spreading dynamics on the human connectome. Neuron 86, 1518–1529 (2015).

  137. Crofts, J. J. & Higham, D. J. A weighted communicability measure applied to complex brain networks. J. R. Soc. Interface 6, 411–414 (2009).

  138. Honey, C. J., Kötter, R., Breakspear, M. & Sporns, O. Network structure of cerebral cortex shapes functional connectivity on multiple time scales. Proc. Natl Acad. Sci. USA 104, 10240–10245 (2007).

  139. Sanz-Leon, P., Knock, S. A., Spiegler, A. & Jirsa, V. K. Mathematical framework for large-scale brain network modeling in the virtual brain. NeuroImage 111, 385–430 (2015).

  140. Deco, G., Jirsa, V., McIntosh, A. R., Sporns, O. & Kötter, R. Key role of coupling, delay, and noise in resting brain fluctuations. Proc. Natl Acad. Sci. USA 106, 10302–10307 (2009).

  141. Honey, C. J., Thivierge, J.-P. & Sporns, O. Can structure predict function in the human brain? NeuroImage 52, 766–776 (2010).

  142. Aceituno, P. V., Yan, G. & Liu, Y.-Y. Tailoring echo state networks for optimal learning. iScience 23, 101440 (2020).

  143. Sporns, O. & Kötter, R. Motifs in brain networks. PLoS Biol. 2, e369 (2004).

  144. Shen, K. et al. Information processing architecture of functionally defined clusters in the macaque cortex. J. Neurosci. 32, 17465–17476 (2012).

  145. Bettinardi, R. G. et al. How structure sculpts function: unveiling the contribution of anatomical connectivity to the brain’s spontaneous correlation structure. Chaos 27, 047409 (2017).

  146. Sizemore, A. E. et al. Cliques and cavities in the human connectome. J. Comput. Neurosci. 44, 115–145 (2018).

  147. Medaglia, J. D. et al. Functional alignment with anatomical networks is associated with cognitive flexibility. Nat. Hum. Behav. 2, 156–164 (2018).

  148. Haimovici, A., Tagliazucchi, E., Balenzuela, P. & Chialvo, D. R. Brain organization into resting state networks emerges at criticality on a model of the human connectome. Phys. Rev. Lett. 110, 178101 (2013).

  149. Poldrack, R. A. & Yarkoni, T. From brain maps to cognitive ontologies: informatics and the search for mental structure. Ann. Rev. Psychol. 67, 587–612 (2016).

  150. Yarkoni, T., Poldrack, R. A., Nichols, T. E., Van Essen, D. C. & Wager, T. D. Large-scale automated synthesis of human functional neuroimaging data. Nat. Methods 8, 665–670 (2011).

  151. Dockès, J. et al. Neuroquery, comprehensive meta-analysis of human brain mapping. eLife 9, e53385 (2020).

  152. Fox, P. T. & Lancaster, J. L. Mapping context and content: the brainmap model. Nat. Rev. Neurosci. 3, 319–321 (2002).

  153. Demirtaş, M. et al. Hierarchical heterogeneity across human cortex shapes large-scale neural dynamics. Neuron 101, 1181–1194 (2019).

  154. Wang, P. et al. Inversion of a large-scale circuit model reveals a cortical hierarchy in the dynamic resting human brain. Sci. Adv. 5, eaat7854 (2019).

  155. Thomas, C. et al. Anatomical accuracy of brain connections derived from diffusion MRI tractography is inherently limited. Proc. Natl Acad. Sci. USA 111, 16574–16579 (2014).

    Article  Google Scholar 

  156. Maier-Hein, K. H. et al. The challenge of mapping the human connectome based on diffusion tractography. Nat. Commun. 8, 1349 (2017).

    Article  Google Scholar 

  157. Desikan, R. S. et al. An automated labeling system for subdividing the human cerebral cortex on MRI scans into gyral based regions of interest. NeuroImage 31, 968–980 (2006).

    Article  Google Scholar 

  158. Cammoun, L. et al. Mapping the human connectome at multiple scales with diffusion spectrum MRI. J. Neurosci. Meth. 203, 386–397 (2012).

    Article  Google Scholar 

  159. Daducci, A. et al. The connectome mapper: an open-source processing pipeline to map connectomes with MRI. PLoS ONE 7, e48121 (2012).

    Article  Google Scholar 

  160. Jones, D., Knösche, T. & Turner, R. White matter integrity, fiber count, and other fallacies: the do’s and don’ts of diffusion MRI. NeuroImage 73, 239–254 (2013).

    Article  Google Scholar 

  161. Zalesky, A. et al. Connectome sensitivity or specificity: which is more important? NeuroImage 142, 407–420 (2016).

    Article  Google Scholar 

  162. Cole, M. W., Bassett, D. S., Power, J. D., Braver, T. S. & Petersen, S. E. Intrinsic and task-evoked network architectures of the human brain. Neuron 83, 238–251 (2014).

    Article  Google Scholar 

  163. Boedecker, J., Obst, O., Lizier, J. T., Mayer, N. M. & Asada, M. Information processing in echo state networks at the edge of chaos. Theory Biosci. 131, 205–213 (2012).

    Article  Google Scholar 

  164. Knock, S. et al. The effects of physiologically plausible connectivity structure on local and global dynamics in large scale brain models. J Neurosci. Meth. 183, 86–94 (2009).

    Article  Google Scholar 

  165. Onnela, J.-P., Saramäki, J., Kertész, J. & Kaski, K. Intensity and coherence of motifs in weighted complex networks. Phys. Rev. E 71, 065103 (2005).

    Article  Google Scholar 

  166. Brandes, U. A faster algorithm for betweenness centrality. J. Math. Sociol. 25, 163–177 (2001).

    MATH  Article  Google Scholar 

  167. Guimera, R. & Amaral, L. A. N. Functional cartography of complex metabolic networks. Nature 433, 895–900 (2005).

    Article  Google Scholar 

  168. Dijkstra, E. W. et al. A note on two problems in connexion with graphs. Num. Math. 1, 269–271 (1959).

    MathSciNet  MATH  Article  Google Scholar 

  169. Leicht, E. A. & Newman, M. E. Community structure in directed networks. Phys. Rev. Lett. 100, 118703 (2008).

    Article  Google Scholar 

  170. Reichardt, J. & Bornholdt, S. When are networks truly modular? Physica D 224, 20–26 (2006).

    MathSciNet  MATH  Article  Google Scholar 

  171. Good, B. H., De Montjoye, Y.-A. & Clauset, A. Performance of modularity maximization in practical contexts. Phys. Rev. E 81, 046106 (2010).

    MathSciNet  Article  Google Scholar 

  172. Suárez, L. E. Code for Learning Function from Structure in Neuromorphic Networks (Zenodo, 2021); https://doi.org/10.5281/zenodo.4776829

  173. Harris, C. R. et al. Array programming with numpy. Nature 585, 357–362 (2020).

    Article  Google Scholar 

  174. Walt, Svd, Colbert, S. C. & Varoquaux, G. The NumPy array: a structure for efficient numerical computation. Comput. Sci. Eng. 13, 22–30 (2011).

    Article  Google Scholar 

  175. Oliphant, T. E. A Guide to NumPy Vol. 1 (Trelgol, 2006).

  176. Virtanen, P. et al. SciPy 1.0: fundamental algorithms for scientific computing in Python. Nat. Methods 17, 261–272 (2020).

    Article  Google Scholar 

  177. McKinney, W. et al. Data structures for statistical computing in Python. In Proc. 9th Python in Science Conference Vol. 445, 51–56 (2010).

  178. Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011).

    MathSciNet  MATH  Google Scholar 

  179. Hagberg, A., Swart, P. & S Chult, D. Exploring Network Structure, Dynamics, and Function Using NetworkX (US Department of Energy, 2008).

  180. Hunter, J. D. Matplotlib: a 2D graphics environment. Comput. Sci. Eng. 9, 90–95 (2007).

    Article  Google Scholar 

  181. Waskom, M. et al. Seaborn v0.7.0 (Zenodo, 2016).

Download references

Acknowledgements

We thank R. Markello, B. Vazquez-Rodriguez, G. Shafiei, V. Bazinet, J. Hansen and Z.-Q. Liu for insightful comments on the manuscript. B.M. acknowledges support from the Natural Sciences and Engineering Research Council of Canada (NSERC Discovery Grant RGPIN no. 017-04265), from the Canada Research Chairs Program, from the Canada First Research Excellence Fund, awarded to McGill University for the Healthy Brains for Healthy Lives initiative, and from the Brain Canada Future Leaders Fund. G.L. acknowledges support from NSERC (Discovery Grant RGPIN-2018-04821), FRQS (Research Scholar Award Junior 1 LAJGU0401-253188), and CIFAR (Canada CIFAR AI Chair). B.A.R. acknowledges support from NSERC (NSERC Discovery Grant RGPIN 2020-05105), Healthy Brains, Healthy Lives (New Investigator Start-up, 2b-NISU-8), and funding from CIFAR (Learning in Machines and Brains Program, Canada CIFAR AI Chair). E.S. acknowledges support from the Fonds de Recherche du Québec—Nature et Technologies (FRQNT) Strategic Clusters Program (2020-RS4-265502—Centre UNIQUE—Union Neurosciences and Artificial Intelligence—Quebec) and the Fonds de Recherche du Québec—Nature et Technologies (FRQNT).

Author information

Authors and Affiliations

Authors

Contributions

L.E.S., G.L. and B.M. conceptualized the work. L.E.S. developed the methodology and performed the formal analysis. L.E.S. and B.M. wrote the original draft; B.A.R. and G.L. reviewed and edited the manuscript. B.M. acquired funding.

Corresponding author

Correspondence to Bratislav Misic.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Machine Intelligence thanks Nabil Imam and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–10 and Table 1.

Reporting Summary


About this article


Cite this article

Suárez, L.E., Richards, B.A., Lajoie, G. et al. Learning function from structure in neuromorphic networks. Nat Mach Intell 3, 771–786 (2021). https://doi.org/10.1038/s42256-021-00376-1

