A deep-learning approach to realizing functionality in nanoelectronic devices

Abstract

Many nanoscale devices require precise optimization to function. Tuning them to the desired operation regime becomes increasingly difficult and time-consuming when the number of terminals and couplings grows. Imperfections and device-to-device variations hinder optimization that uses physics-based models. Deep neural networks (DNNs) can model various complex physical phenomena but, so far, are mainly used as predictive tools. Here, we propose a generic deep-learning approach to efficiently optimize complex, multi-terminal nanoelectronic devices for desired functionality. We demonstrate our approach for realizing functionality in a disordered network of dopant atoms in silicon. We model the input–output characteristics of the device with a DNN, and subsequently optimize control parameters in the DNN model through gradient descent to realize various classification tasks. When the corresponding control settings are applied to the physical device, the resulting functionality is as predicted by the DNN model. We expect our approach to contribute to fast, in situ optimization of complex (quantum) nanoelectronic devices.
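To make the two-step approach concrete, below is a minimal, hypothetical sketch (in PyTorch) of the first step: fitting a feed-forward DNN surrogate to sampled input–output data of a multi-terminal device. The network size, voltage ranges, tensor shapes and the randomly generated placeholder data are illustrative assumptions and are not taken from the paper or the SkyNEt code.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder for measured data: 7 electrode voltages -> 1 output current.
# In a real experiment these arrays would hold the sampled device measurements.
voltages = torch.rand(10000, 7) * 1.2 - 0.6   # e.g. electrode voltages in [-0.6, 0.6] V
currents = torch.rand(10000, 1)               # measured output currents (placeholder values)

# Feed-forward surrogate model of the device's input-output characteristics.
surrogate = nn.Sequential(
    nn.Linear(7, 90), nn.ReLU(),
    nn.Linear(90, 90), nn.ReLU(),
    nn.Linear(90, 90), nn.ReLU(),
    nn.Linear(90, 1),
)

optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Simple full-batch training loop; in practice one would train on mini-batches
# and verify the fit on a held-out test set.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(surrogate(voltages), currents)
    loss.backward()
    optimizer.step()
```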

Fig. 1: Realizing functionality in a nanoelectronic device by using a DNN model.
Fig. 2: Sampling input–output data to train and test the DNN.
Fig. 3: DNN prediction of device functionality and verification.
Fig. 4: Prediction and verification of Boolean logic.
Fig. 5: Ring classification functionality.
Fig. 6: Feature mapping task.
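The workflow summarized in the figure list above can be illustrated by an equally hypothetical second step, continuing the sketch after the abstract: the fitted surrogate is frozen and only the control voltages are optimized by gradient descent so that two designated input electrodes realize a Boolean gate such as XOR (cf. Figs. 1 and 4). The electrode assignment (two inputs, five controls), the sigmoid read-out and the mean-squared-error loss are assumptions for illustration and do not reproduce the exact choices of the paper or the SkyNEt code.

```python
import torch

# Freeze the surrogate fitted in the previous sketch; from here on only the
# control voltages are trainable.
for p in surrogate.parameters():
    p.requires_grad_(False)

# Four logic input combinations applied to two designated input electrodes.
logic_inputs = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
xor_targets = torch.tensor([[0.], [1.], [1.], [0.]])

# The five remaining electrodes act as control voltages, shared across all
# four input combinations.
controls = torch.zeros(1, 5, requires_grad=True)
optimizer = torch.optim.Adam([controls], lr=0.05)

for step in range(500):
    optimizer.zero_grad()
    full_input = torch.cat([logic_inputs, controls.expand(4, -1)], dim=1)
    output = torch.sigmoid(surrogate(full_input))   # squash predicted current to (0, 1)
    loss = torch.nn.functional.mse_loss(output, xor_targets)
    loss.backward()
    optimizer.step()
```

In the experiment described in the abstract, the optimized control settings found in this way are then applied to the physical device and the measured outputs are compared with the surrogate's predictions.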

Data availability

Data are available from the public repository https://data.4tu.nl at https://doi.org/10.4121/12884804.

Code availability

The custom computer code used here is available under the GNU General Public License v3.0 at https://github.com/BraiNEdarwin/SkyNEt.

Acknowledgements

We thank B. J. Geurts, U. Alegre Ibarra, B. de Wilde and L. J. Knoll for fruitful discussions. We are grateful to U. Alegre Ibarra for reading the manuscript carefully and providing useful input. We thank M. H. Siekman and J. G. M. Sanderink for technical support. We acknowledge financial support from the University of Twente, the Dutch Research Council (NWA Startimpuls grant no. 400-17-607) and the Natuurkunde Projectruimte (grant no. 680-91-114).

Author information

Contributions

H.-C.R.E., M.N.B. and J.T.W. performed the measurements and the DNN modelling. B.v.d.V. and T.C. fabricated the samples. H.-C.R.E., M.N.B., J.T.W., P.A.B. and W.G.v.d.W. wrote the manuscript and all of the authors contributed to revisions. W.G.v.d.W. and H.-C.R.E. conceived the project and designed the experiments with input from M.N.B. and J.T.W. W.G.v.d.W., P.A.B., H.B. and H.-C.R.E. supervised the project.

Corresponding author

Correspondence to Wilfred G. van der Wiel.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information: Nature Nanotechnology thanks Matthew Dale, Gunnar Tufte and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Figs. 1–6, Sections 1–4 and Tables 1–10.

About this article

Cite this article

Ruiz Euler, H.-C., Boon, M.N., Wildeboer, J.T. et al. A deep-learning approach to realizing functionality in nanoelectronic devices. Nat. Nanotechnol. 15, 992–998 (2020). https://doi.org/10.1038/s41565-020-00779-y
