Review Article

Probabilistic machine learning and artificial intelligence

Nature volume 521, pages 452–459 (28 May 2015)

Abstract

How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
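To make the abstract's central idea concrete, here is a minimal sketch of learning from experience as probabilistic inference: Bayesian updating of an unknown coin bias under a conjugate Beta-Bernoulli model. The model choice, the uniform prior, and the observed counts are illustrative assumptions, not examples drawn from the article itself.

```python
# Probabilistic learning from experience, in miniature: a Beta prior over a
# coin's unknown bias is updated by observed flips into a Beta posterior.

def beta_binomial_posterior(a, b, heads, tails):
    """Conjugate update: Beta(a, b) prior + observed counts -> Beta posterior."""
    return a + heads, b + tails

# Start with a uniform prior Beta(1, 1): complete uncertainty about the bias.
a, b = 1.0, 1.0

# Experience: observe 7 heads and 3 tails.
a, b = beta_binomial_posterior(a, b, heads=7, tails=3)

# The posterior represents both a prediction and the remaining uncertainty.
posterior_mean = a / (a + b)                                   # 8/12 ≈ 0.667
posterior_var = (a * b) / ((a + b) ** 2 * (a + b + 1))         # shrinks with data

print(posterior_mean, posterior_var)
```

The key point the framework emphasises is that the output is a full distribution, not a point estimate: the variance quantifies how much the model still does not know, and it shrinks as more data arrive.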



Acknowledgements

I acknowledge an EPSRC grant EP/I036575/1, the DARPA PPAML programme, a Google Focused Research Award for the Automatic Statistician and support from Microsoft Research. I am also grateful for valuable input from D. Duvenaud, H. Ge, M. W. Hoffman, J. R. Lloyd, A. Ścibior, M. Welling and D. Wolpert.

Author information

Affiliations

  1. Department of Engineering, University of Cambridge, Trumpington Street, Cambridge CB2 1PZ, UK.

    • Zoubin Ghahramani


Competing interests

I serve as an advisor to a number of companies working on machine learning and artificial intelligence.

Corresponding author

Correspondence to Zoubin Ghahramani.


About this article


DOI

https://doi.org/10.1038/nature14541
