Integrated information theory: from consciousness to its physical substrate

Journal name: Nature Reviews Neuroscience
Volume: 17
Pages: 450–461
Year published: 2016
DOI: 10.1038/nrn.2016.44

Abstract

In this Opinion article, we discuss how integrated information theory accounts for several aspects of the relationship between consciousness and the brain. Integrated information theory starts from the essential properties of phenomenal experience, from which it derives the requirements for the physical substrate of consciousness. It argues that the physical substrate of consciousness must be a maximum of intrinsic cause–effect power and provides a means to determine, in principle, the quality and quantity of experience. The theory leads to some counterintuitive predictions and can be used to develop new tools for assessing consciousness in non-communicative patients.

Figures

  1. Figure 1: An experience is a conceptual structure.

    According to integrated information theory (IIT), a particular experience (illustrated here from the point of view of the subject) is identical to a conceptual structure specified by a physical substrate. The true physical substrate of the depicted experience (seeing one's hands on the piano) and the associated conceptual structure are highly complex. To allow a complete analysis of conceptual structures, the physical substrate illustrated here was chosen to be extremely simple1, 2: four logic gates (labelled A, B, C and D, where A is a Majority (MAJ) gate, B is an OR gate, and C and D are AND gates; the straight arrows indicate connections among the logic gates, the curved arrows indicate self-connections) are shown in a particular state (ON or OFF). The analysis of this system, performed according to the postulates of IIT, identifies a conceptual structure supported by a complex constituted of the elements A, B and C in their current ON states. The borders of the complex, which include elements A, B and C but exclude element D, are indicated by the green circle. According to IIT, such a complex would be a physical substrate of consciousness (Supplementary information S1 (figure)). The conceptual structure is represented as a set of stars and, equivalently, as a set of histograms. The green circle represents the fact that experience is definite (it has borders). Each histogram illustrates the cause–effect repertoire of a concept: how a particular mechanism constrains the probability of past and future states of its maximally irreducible purview within the complex ABC. The bins on the horizontal axis at the bottom of the histograms represent the 16-dimensional cause–effect space of the complex — all its eight possible past states (p; in blue) and eight possible future states (f; in red; ON is 1 and OFF is 0). The vertical axis represents the probability of each state (for consistency, the probability values shown are over the states of the entire complex and not just over the subset of elements constituting the purview). In this example, five of seven possible concepts exist, specified by the mechanisms A, B, C, AB and AC (all with φmax > 0) in their current state (which are labelled as Ac, Bc, etc.). The subsets BC and ABC do not specify any concept because their cause–effect repertoire is reducible by partitions (φmax = 0). In the middle, the 16-dimensional cause–effect space of the complex is represented as a circle, where each of the 16 axes corresponds to one of the eight possible past (p; blue arrows) and eight possible future states (f; red arrows) of the complex, and the position along the axis represents the probability of that state. Each concept is depicted as a star, the position of which in cause–effect space represents how the concept specifies the probability of past and future states of the complex, and the size of which measures how irreducible the concept is (φmax). Relations between two concepts (overlaps in their purviews) are represented as lines between the stars. The fundamental identity postulated by IIT is that the set of concepts and their relations that compose the conceptual structure is identical to the quality of the experience. This is how the experience feels — what it is like to be the complex ABC in its current state 111. The intrinsic irreducibility of the entire conceptual structure (Φmax, a non-negative number) reflects how much consciousness there is (the quantity of the experience). The irreducibility of each concept (φmax) reflects how much each phenomenal distinction exists within the experience. Different experiences correspond to different conceptual structures.
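
A minimal computational sketch can make the logic-gate example of Figure 1 concrete. The Python fragment below is not the authors' analysis and does not compute φ or Φ (the full IIT 3.0 procedure, which additionally partitions mechanisms and purviews, is given in ref. 1 and in the pseudocode of Supplementary information S2); it only illustrates the first ingredient, a cause repertoire obtained by inverting a deterministic transition table under a uniform prior over past states. Because the legend does not spell out the connectivity of A, B, C and D, the wiring used here is an illustrative assumption.

```python
from itertools import product

# Assumed wiring for illustration only: which elements feed each gate (A=0, B=1, C=2, D=3).
INPUTS = {0: (1, 2, 3), 1: (0, 2), 2: (0, 1), 3: (2,)}

def update(state):
    """Deterministic update: A is a MAJ gate, B an OR gate, C and D AND gates."""
    a = int(sum(state[i] for i in INPUTS[0]) >= 2)   # MAJ: at least 2 of its 3 inputs ON
    b = int(any(state[i] for i in INPUTS[1]))        # OR
    c = int(all(state[i] for i in INPUTS[2]))        # AND
    d = int(all(state[i] for i in INPUTS[3]))        # AND over a single input (copies its input)
    return (a, b, c, d)

# Full transition table over the 16 possible states of the four binary elements.
transitions = {s: update(s) for s in product((0, 1), repeat=4)}

def cause_repertoire(mechanism, mechanism_state):
    """P(past system state | the mechanism is now in mechanism_state), obtained by
    Bayesian inversion of the transition table under a uniform prior over past states."""
    compatible = [past for past, nxt in transitions.items()
                  if all(nxt[element] == value
                         for element, value in zip(mechanism, mechanism_state))]
    p = 1.0 / len(compatible) if compatible else 0.0
    return {past: (p if past in compatible else 0.0) for past in transitions}

# Example: how element A (index 0) being ON now constrains the possible past states of ABCD.
for past, prob in sorted(cause_repertoire((0,), (1,)).items()):
    if prob > 0:
        print("past state", past, "probability", round(prob, 3))
```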

  2. Figure 2: Identifying the elements, timescale and states of the physical substrate of consciousness (PSC) from first principles.

    It is possible to determine maxima of cause–effect power within the central nervous system by perturbing and observing neural elements at various micro- and macro-levels18. High cause–effect power is reflected in deterministic responses and low cause–effect power is reflected in responses that vary randomly across trials. a | To identify the spatial grain of the elements of the PSC supporting consciousness, a schematic example shows how optogenetic perturbation and unit recording could be applied to a subset of neurons (here, 3 out of 36 neurons) to establish maxima of cause–effect power. For each of three trials, the left-hand panel shows the effects of the perturbation on the entire system at the micro-level. Grey neurons are unaffected, blue neurons decrease their firing rates, red neurons increase their firing rates and purple neurons respond with burst firing. The right-hand panel shows the effects of the perturbation at the macro-level after coarse-graining of the 36 neurons into nine groups of four cells each. Macro-states are defined according to the rule that if ≥50% of the neurons in the group are in a given micro-state (such as low firing, high firing or bursting), then the group is considered to be in that state at the macro-level. In this example, the macro-level (groups of neurons) has higher cause–effect power than the micro-level (single neurons), because the response is deterministic at the macro-level (as evidenced by the consistent colour scheme), whereas there are variations between trials at the micro-level (inconsistent colours). b | To identify the temporal grain of neuronal activity supporting consciousness, a possible experimental setup would be one in which one neuron (the top trace) is optogenetically excited while recording from other neurons (labelled N1–N4) across three trials, shown in the upper panel at the 10 ms timescale (micro-scale). Grey shading indicates no effects on neuron firing in the 10 ms following the stimulation compared with the 10 ms before the stimulation, blue shading indicates decreased firing and red shading indicates increased firing. The lower panel shows the same data after temporal coarse-graining over 100 ms intervals. Macro-states are defined according to the rule that if a neuron increases (or decreases) its firing rate by >50% within 100 ms post-stimulus compared with the baseline, the macro-state is considered to be high (or low) firing. In this example, the macro-level (100 ms intervals) has higher cause–effect power (more deterministic responses) than the micro-level (10 ms intervals). c | To identify the neural states that support consciousness, optogenetic perturbations could be used to drive one neuron to fire at low frequency, at high tonic frequency or in bursts (top trace), resulting in spectral peaks at 2 Hz (green), 50 Hz (red) and 150 Hz (yellow) for neurons N1–N4 (data are shown as a firing rate histogram). For each trial, the upper panel shows the responses of the other four neurons to each stimulation frequency at the micro-scale level in the spectral domain (micro-bins, only a few of which are represented). The coloured bars indicate coincidence, within a micro-bin, between the frequency of stimulation and the spectral peak of the responses. The lower panel of each trial shows the effect of the perturbation at the corresponding macro-level after spectral coarse-graining. Macro-states map into micro-states as indicated below the frequency bins. Here, spectral coarse-graining (binning firing rates into three levels: low, high and burst firing) results in higher cause–effect power (responses that are more deterministic) than at the micro-level.
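
The coarse-graining rule in panel a can be mimicked with synthetic data. The sketch below is a toy, not an analysis pipeline from the paper: it generates micro-level responses for 36 neurons over three perturbation trials, forms nine macro-elements of four neurons each with the "50% or more of neurons in a state" rule, and scores each level by how repeatable the responses are across trials. The noise level and the repeatability score are assumptions standing in for the far more demanding cause–effect power measures of IIT (ref. 18).

```python
import numpy as np

rng = np.random.default_rng(0)
STATES = np.array([0, 1, 2])   # 0 = low firing, 1 = high firing, 2 = bursting

# Synthetic micro-level data: 3 trials x 36 neurons. Each group of four neighbouring
# neurons has a preferred response to the perturbation, but individual neurons
# deviate from it on roughly 20% of observations (micro-level noise).
group_pref = rng.choice(STATES, size=9)                        # preferred state per group
micro = np.repeat(group_pref, 4)[None, :].repeat(3, axis=0)    # 3 trials x 36 neurons
noise = rng.random(micro.shape) < 0.2
micro[noise] = rng.choice(STATES, size=int(noise.sum()))

def coarse_grain(trial, group_size=4):
    """Macro-state of each group of neurons. With four neurons and three possible
    states, the modal state always covers >=50% of the group, so taking the mode
    implements the '>=50% of neurons in a state' rule (ties go to the lower index)."""
    groups = trial.reshape(-1, group_size)
    return np.array([int(np.argmax(np.bincount(g, minlength=len(STATES)))) for g in groups])

def determinism(responses):
    """Fraction of units whose response is identical on every trial: a crude
    stand-in for the repeatability that signals high cause-effect power."""
    return float(np.mean(np.all(responses == responses[0], axis=0)))

macro = np.stack([coarse_grain(trial) for trial in micro])
print("micro-level determinism:", determinism(micro))   # typically well below 1
print("macro-level determinism:", determinism(macro))   # typically close to 1
```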

  3. Figure 3: Identifying the physical substrate of consciousness (PSC) from first principles.

    The complex of neural elements that constitutes the PSC can be identified by searching for maxima of intrinsic cause–effect power. a | For example, assume that the elements, timescale and states at which intrinsic cause–effect power reaches a maximum have been identified using optogenetic and unit recording tools (Fig. 2). Here, the elements are groups of neurons, the timescale is over 100 ms and there are three states (low, high and burst firing). b | In a healthy, awake participant, the set of neural elements specifying the conceptual structure with the highest Φmax is assumed, based on current evidence, to be a complex of neuronal groups distributed over the posterior cortex and portions of the anterior cortex5. Empirical studies can, in principle, establish whether the full neural correlates of consciousness5 correspond to the maximum of intrinsic cause–effect power, thereby corroborating or falsifying a key prediction of integrated information theory. c | The boundaries of the PSC (green line) may change after cortical lesions, such as those causing absolute achromatopsia, resulting in a smaller PSC. d | The PSC boundaries may also move as a result of changes in excitability and effective connectivity, as might occur during pure thought that is devoid of sensory content. e | The PSC could also split into two large local maxima of cause–effect power (represented here by green and blue boundaries) as a result of anatomical disconnections, such as in split-brain patients, in which instance each hemisphere would have its own consciousness. f | The PSC may also split as a result of functional disconnections, which may occur in some psychiatric disorders and perhaps under certain dual-task conditions — for example while driving and talking at the same time. g | The coexistence of a large major complex with one or more minor complexes that may support sophisticated, seemingly unconscious performance could be a common occurrence in everyday life.
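
In computational terms, the search sketched in this figure amounts to evaluating candidate subsets of elements and keeping the one whose cause–effect structure is maximally irreducible. The fragment below shows only that outer search loop; the toy_phi() function returns made-up values and the element labels are hypothetical, whereas the real Φ calculation is defined in ref. 1 and implemented for small systems in the Tononi laboratory's PyPhi software.

```python
from itertools import combinations
from typing import Callable, FrozenSet, Iterable, Tuple

def find_major_complex(
    elements: Iterable[str],
    phi: Callable[[FrozenSet[str]], float],
) -> Tuple[FrozenSet[str], float]:
    """Return the candidate subset with the highest integrated information.

    Exhaustive search is exponential in the number of elements, which is why
    Phi can only be computed exactly for very small systems and must be
    approximated for anything brain-sized.
    """
    elements = list(elements)
    best, best_phi = frozenset(), 0.0
    for size in range(2, len(elements) + 1):      # a complex needs at least two elements
        for subset in combinations(elements, size):
            candidate = frozenset(subset)
            value = phi(candidate)
            if value > best_phi:
                best, best_phi = candidate, value
    return best, best_phi

def toy_phi(subset: FrozenSet[str]) -> float:
    """Made-up Phi values that favour one 'posterior hot zone' of elements."""
    return 10.0 if subset == frozenset({"V1", "V2", "IT", "PPC"}) else 0.1 * len(subset)

complex_, phi_max = find_major_complex(["V1", "V2", "IT", "PPC", "PFC", "CBL"], toy_phi)
print("major complex:", sorted(complex_), "Phi_max:", phi_max)
```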

  4. Figure 4: Phenomenal content and access content.

    The content of an experience is much larger than what can be reported by a subject at any point in time. The left-hand panel illustrates the Sperling task41, which involves the brief presentation of a three-by-four array of letters on a screen, and a particular row being cued by a tone. Out of the 12 letters shown on the display, participants correctly report only three or four letters — the letters cued by the tone — reflecting limited access. The top middle panel illustrates a highly simplified conceptual structure that corresponds to seeing the Sperling display, using the same conventions as outlined in Fig. 1. The myriad of positive and negative, first- and high-order, low- and high-invariance concepts (represented by stars) that specify the content of this particular experience (seeing the Sperling display and having to report which letters were seen) make it what it is and different from countless other experiences (rich phenomenal content). The bottom panel schematically illustrates the physical substrate of consciousness (PSC) that might correspond to this particular conceptual structure (its boundary is represented by a green line). The PSC consists of neuronal groups that can be in a low firing state, a high firing state or a bursting state. Alone and in combination, these neuronal groups specify all the concepts that compose the conceptual structure. Stars that are linked to the PSC by grey dashed lines represent a small subset of these concepts. The PSC is synaptically connected to neurons in Broca's area by means of a limited-capacity channel (dashed black arrow) that is dynamically gated by top-down connections (shown as solid black arrows) originating in the prefrontal cortex to carry out the instruction (that is, to report the observed letters 'OSA').
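
The access bottleneck that the Sperling task exposes can be caricatured in a few lines. The toy below is not a model from the paper: the four-item channel capacity, the routing rule and the letter set are assumptions chosen only to show why whole report caps at about four letters even though a post-stimulus cue reveals that any row of the display was available.

```python
import random

random.seed(1)

# A 3 x 4 array of distinct consonants, as in a Sperling display.
letters = random.sample("BCDFGHJKLMNPQRSTVWXZ", 12)
rows = [letters[0:4], letters[4:8], letters[8:12]]

CHANNEL_CAPACITY = 4   # assumed size of the limited-capacity report channel

def whole_report(rows):
    """Try to report the entire display: only ~4 items make it through the channel."""
    flat = [letter for row in rows for letter in row]
    return flat[:CHANNEL_CAPACITY]

def partial_report(rows, cued_row):
    """A post-stimulus tone cues one row; routing only that row through the
    channel lets all of its letters be reported, whichever row is cued."""
    return rows[cued_row][:CHANNEL_CAPACITY]

print("display:      ", rows)
print("whole report: ", whole_report(rows))       # 4 of 12 letters
print("cued row 1:   ", partial_report(rows, 1))  # all 4 letters of the cued row
```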

References

  1. Oizumi, M., Albantakis, L. & Tononi, G. From the phenomenology to the mechanisms of consciousness: integrated information theory 3.0. PLoS Comput. Biol. 10, e1003588 (2014).
  2. Tononi, G. The integrated information theory of consciousness: an updated account. Arch. Ital. Biol. 150, 56–90 (2012).
  3. Tononi, G. Integrated information theory. Scholarpedia http://dx.doi.org/10.4249/scholarpedia.4164 (2015).
  4. Posner, J. B., Saper, C. B., Schiff, N. D. & Plum, F. Diagnosis of Stupor and Coma (Oxford Univ. Press, 2007).
  5. Koch, C., Massimini, M., Boly, M. & Tononi, G. The neural correlates of consciousness: progress and problems. Nat. Rev. Neurosci. 17, 307–321 (2016).
  6. Boly, M. et al. Consciousness in humans and non-human animals: recent advances and future directions. Front. Psychol. 4, 625 (2013).
  7. Lemon, R. N. & Edgley, S. A. Life without a cerebellum. Brain 133, 652–654 (2010).
  8. Yu, F., Jiang, Q. J., Sun, X. Y. & Zhang, R. W. A new case of complete primary cerebellar agenesis: clinical and imaging findings in a living patient. Brain 138, e353 (2015).
  9. Tononi, G. & Koch, C. Consciousness: here, there, and everywhere? Phil. Trans. R. Soc. B 370, 20140167 (2015).
  10. Chalmers, D. J. Facing up to the problem of consciousness. J. Conscious. Studies 2, 200–219 (1995).
  11. Tononi, G. An information integration theory of consciousness. BMC Neurosci. 5, 42 (2004).
  12. Tononi, G. Consciousness as integrated information: a provisional manifesto. Biol. Bull. 215, 216–242 (2008).
  13. Descartes, R. Discourse on Method and Meditations on First Philosophy (Hackett, 1998).
  14. Pöppel, E. Mindworks: Time and Conscious Experience (Harcourt Brace Jovanovich, 1988).
  15. Holcombe, A. O. Seeing slow and seeing fast: two limits on perception. Trends Cogn. Sci. 13, 216–221 (2009).
  16. Bachmann, T. Microgenetic Approach to the Conscious Mind (John Benjamins, 2000).
  17. Kim, J. Multiple realization and the metaphysics of reduction. Philos. Phenomenol. Res. 52, 1–26 (1992).
  18. Hoel, E. P., Albantakis, L. & Tononi, G. Quantifying causal emergence shows that macro can beat micro. Proc. Natl Acad. Sci. USA 110, 19790–19795 (2013).
  19. Alivisatos, A. P. et al. The brain activity map project and the challenge of functional connectomics. Neuron 74, 970–974 (2012).
  20. Buzsáki, G. Neural syntax: cell assemblies, synapsembles, and readers. Neuron 68, 362–385 (2010).
  21. Li, C. Y., Poo, M. M. & Dan, Y. Burst spiking of a single cortical neuron modifies global brain state. Science 324, 643–646 (2009).
  22. London, M., Roth, A., Beeren, L., Häusser, M. & Latham, P. E. Sensitivity to perturbations in vivo implies high noise and suggests rate coding in cortex. Nature 466, 123–127 (2010).
  23. Boly, M. et al. Stimulus set meaningfulness and neurophysiological differentiation: a functional magnetic resonance imaging study. PLoS ONE 10, e0125337 (2015).
  24. Boly, M. et al. Brain connectivity in disorders of consciousness. Brain Connect. 2, 1–10 (2012).
  25. Seth, A. K., Barrett, A. B. & Barnett, L. Causal density and integrated information as measures of conscious level. Philos. Trans. A Math. Phys. Eng. Sci. 369, 3748–3767 (2011).
  26. Deco, G., Hagmann, P., Hudetz, A. G. & Tononi, G. Modeling resting-state functional networks when the cortex falls asleep: local and global changes. Cereb. Cortex 24, 3180–3194 (2014).
  27. von Arx, S. W., Muri, R. M., Heinemann, D., Hess, C. W. & Nyffeler, T. Anosognosia for cerebral achromatopsia — a longitudinal case study. Neuropsychologia 48, 970–977 (2010).
  28. Goldberg, I. I., Harel, M. & Malach, R. When the brain loses its self: prefrontal inactivation during sensorimotor processing. Neuron 50, 329–339 (2006).
  29. Steriade, M., Timofeev, I. & Grenier, F. Natural waking and sleep states: a view from inside neocortical neurons. J. Neurophysiol. 85, 1969–1985 (2001).
  30. Nir, Y. et al. Regional slow waves and spindles in human sleep. Neuron 70, 153–169 (2011).
  31. Siclari, F., LaRocque, J. J., Bernardi, G., Postle, B. R. & Tononi, G. The neural correlates of consciousness in sleep: a no-task, within-state paradigm. bioRxiv http://dx.doi.org/10.1101/012443 (2014).
  32. Sperry, R. W. in The Neurosciences: Third Study Program (eds Schmitt, F. O. & Worden, F. G.) 5–19 (MIT Press, 1974).
  33. Gazzaniga, M. S. Forty-five years of split-brain research and still going strong. Nat. Rev. Neurosci. 6, 653–659 (2005).
  34. Berlin, H. A. The neural basis of the dynamic unconscious. Neuropsychoanalysis 13, 168 (2011).
  35. Mudrik, L., Breska, A., Lamy, D. & Deouell, L. Y. Integration without awareness: expanding the limits of unconscious processing. Psychol. Sci. 22, 764–770 (2011).
  36. Mudrik, L., Faivre, N. & Koch, C. Information integration without awareness. Trends Cogn. Sci. 18, 488–496 (2014).
  37. Lamme, V. A. & Roelfsema, P. R. The distinct modes of vision offered by feedforward and recurrent processing. Trends Neurosci. 23, 571–579 (2000).
  38. Harris, K. D. & Shepherd, G. M. The neocortical circuit: themes and variations. Nat. Neurosci. 18, 170–181 (2015).
  39. Miller, G. A. The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychol. Rev. 63, 81–97 (1956).
  40. Norretranders, T. The User Illusion: Cutting Consciousness Down to Size (Viking Penguin, 1991).
  41. Sperling, G. The information available in brief visual presentations. Psychol. Monogr. 74, 1–29 (1960).
  42. Cohen, M. A. & Dennett, D. C. Consciousness cannot be separated from function. Trends Cogn. Sci. 15, 358–364 (2011).
  43. Cohen, M. A. & Dennett, D. C. Response to Fahrenfort and Lamme: defining reportability, accessibility and sufficiency in conscious awareness. Trends Cogn. Sci. 16, 139–140 (2012).
  44. O'Regan, J. K., Rensink, R. A. & Clark, J. J. Change-blindness as a result of 'mudsplashes'. Nature 398, 34 (1999).
  45. Dehaene, S. Consciousness and the Brain: Deciphering How the Brain Codes our Thoughts (Penguin, 2014).
  46. Kouider, S., de Gardelle, V., Sackur, J. & Dupoux, E. How rich is consciousness? The partial awareness hypothesis. Trends Cogn. Sci. 14, 301–307 (2010).
  47. Block, N. On a confusion about a function of consciousness. Behav. Brain Sci. 18, 227–287 (1995).
  48. Block, N. Perceptual consciousness overflows cognitive access. Trends Cogn. Sci. 15, 567–575 (2011).
  49. Lamme, V. A. How neuroscience will change our view on consciousness. Cogn. Neurosci. 1, 204–220 (2010).
  50. Bronfman, Z. Z., Brezis, N., Jacobson, H. & Usher, M. We see more than we can report: 'cost free' color phenomenality outside focal attention. Psychol. Sci. 25, 1394–1403 (2014).
  51. Wolfe, J. in Fleeting Memories (ed. Coltheart, V.) 71–94 (MIT Press, 2000).
  52. Felleman, D. J. & Van Essen, D. C. Distributed hierarchical processing in the primate cerebral cortex. Cereb. Cortex 1, 1–47 (1991).
  53. Riesenhuber, M. & Poggio, T. Hierarchical models of object recognition in cortex. Nat. Neurosci. 2, 1019–1025 (1999).
  54. Franzius, M., Sprekeler, H. & Wiskott, L. Slowness and sparseness lead to place, head-direction, and spatial-view cells. PLoS Comput. Biol. 3, e166 (2007).
  55. Spratling, M. W. Learning posture invariant spatial representations through temporal correlations. IEEE Trans. Autonom. Ment. Dev. 1, 253–263 (2009).
  56. Treisman, A. The binding problem. Curr. Opin. Neurobiol. 6, 171–178 (1996).
  57. Baddeley, A. D. Working Memory (Clarendon Press, 1986).
  58. Herculano-Houzel, S. The remarkable, yet not extraordinary, human brain as a scaled-up primate brain and its associated cost. Proc. Natl Acad. Sci. USA 109 (Suppl. 1), 10661–10668 (2012).
  59. Jain, S. K. et al. Bilateral large traumatic basal ganglia haemorrhage in a conscious adult: a rare case report. Brain Inj. 27, 500–503 (2013).
  60. Straussberg, R. et al. Familial infantile bilateral striatal necrosis: clinical features and response to biotin treatment. Neurology 59, 983–989 (2002).
  61. Caparros-Lefebvre, D., Destee, A. & Petit, H. Late onset familial dystonia: could mitochondrial deficits induce a diffuse lesioning process of the whole basal ganglia system? J. Neurol. Neurosurg. Psychiatry 63, 196–203 (1997).
  62. Pigorini, A. et al. Bistability breaks-off deterministic responses to intracortical stimulation during non-REM sleep. Neuroimage 112, 105–113 (2015).
  63. Blumenfeld, H. Impaired consciousness in epilepsy. Lancet Neurol. 11, 814–826 (2012).
  64. Friston, K. The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138 (2010).
  65. Edlund, J. A. et al. Integrated information increases with fitness in the evolution of animats. PLoS Comput. Biol. 7, e1002236 (2011).
  66. Albantakis, L., Hintze, A., Koch, C., Adami, C. & Tononi, G. Evolution of integrated causal structures in animats exposed to environments of increasing complexity. PLoS Comput. Biol. 10, e1003966 (2014).
  67. Massimini, M. et al. Breakdown of cortical effective connectivity during sleep. Science 309, 2228–2232 (2005).
  68. Casali, A. G. et al. A theoretically based index of consciousness independent of sensory processing and behavior. Sci. Transl Med. 5, 198ra105 (2013).
  69. Massimini, M. et al. Cortical reactivity and effective connectivity during REM sleep in humans. Cogn. Neurosci. 1, 176–183 (2010).
  70. Sarasso, S. et al. Consciousness and complexity during unresponsiveness induced by propofol, xenon, and ketamine. Curr. Biol. 25, 3099–3105 (2015).
  71. Barrett, A. B. & Seth, A. K. Practical measures of integrated information for time-series data. PLoS Comput. Biol. 7, e1001052 (2011).
  72. Oizumi, M., Amari, S., Yanagawa, T., Fujii, N. & Tsuchiya, N. Measuring integrated information from the decoding perspective. PLoS Comput. Biol. 12, e1004654 (2015).
  73. Hudetz, A. G., Liu, X. & Pillay, S. Dynamic repertoire of intrinsic brain states is reduced in propofol-induced unconsciousness. Brain Connect. 5, 10–22 (2015).
  74. Barttfeld, P. et al. Signature of consciousness in the dynamics of resting-state brain activity. Proc. Natl Acad. Sci. USA 112, 887–892 (2015).
  75. Tagliazucchi, E. et al. Large-scale signatures of unconsciousness are consistent with a departure from critical dynamics. J. R. Soc. Interface 13, 20151027 (2016).
  76. Sullivan, P. R. Contentless consciousness and information-processing theories of mind. Philos. Psychiatry Psychol. 2, 51–59 (1995).
  77. Baars, B. J. A Cognitive Theory of Consciousness (Cambridge Univ. Press, 1988).
  78. Dehaene, S. & Changeux, J.-P. Experimental and theoretical approaches to conscious processing. Neuron 70, 200–227 (2011).
  79. Steriade, M. The corticothalamic system in sleep. Front. Biosci. 8, d878–d899 (2003).
  80. Searle, J. Can information theory explain consciousness? New York Review of Books (10 Jan 2013).


Author information

Affiliations

  1. Department of Psychiatry, University of Wisconsin, 6001 Research Park Boulevard, Madison, Wisconsin 53719, USA.

    • Giulio Tononi
  2. Department of Psychiatry, University of Wisconsin, 6001 Research Park Boulevard, Madison, Wisconsin 53719, USA; and at the Department of Neurology, University of Wisconsin, 1685 Highland Avenue, Madison, Wisconsin 53705, USA.

    • Melanie Boly
  3. Department of Biomedical and Clinical Sciences 'Luigi Sacco', University of Milan, Via G.B. Grassi 74, Milan 20157, Italy; and at the Istituto di Ricovero e Cura a Carattere Scientifico, Fondazione Don Carlo Gnocchi, Via A. Capecelatro 66, Milan 20148, Italy.

    • Marcello Massimini
  4. Allen Institute for Brain Science, 615 Westlake Ave N, Seattle, Washington 98109, USA.

    • Christof Koch

Competing interests statement

The authors declare no competing interests.

Author details

  • Giulio Tononi

    Giulio Tononi is a neuroscientist and psychiatrist at the Department of Psychiatry at the University of Wisconsin, Madison, USA, where he works on the nature of consciousness and the functions of sleep.

  • Melanie Boly

    Melanie Boly is a neurologist and neuroscientist at the University of Wisconsin, Madison, USA. She uses neuroimaging and theoretical approaches to understand the neural substrate of experience and its alterations in sleep, anaesthesia and disorders of consciousness.

  • Marcello Massimini

    Marcello Massimini is Associate Professor of Neurophysiology at the University of Milan, Italy. He was trained as a medical doctor and devotes his research to understanding the neuronal mechanisms of loss and recovery of consciousness in sleep, anaesthesia and in patients with brain injuries.

  • Christof Koch

    Christof Koch is the President and Chief Scientific Officer of the Allen Institute for Brain Science in Seattle, Washington, USA. He trained as a physicist. On a quest to understand the physical roots of consciousness before his brain stops functioning, he published his first paper on the neural correlates of consciousness with Francis Crick a quarter of a century ago.

Supplementary information

PDF files

  1. Supplementary information S1 (figure) (284 KB)

    Axioms and postulates of IIT.

  2. Supplementary information S2 (box) (83 KB)

    IIT pseudocode.

  3. Supplementary information S3 (figure) (189 KB)

    Integrated information, neuroanatomy and neurophysiology.

  4. Supplementary information S4 (box) (785 KB)

    Neuronal bistability impairs information integration during slow wave sleep.

  5. Supplementary information S5 (box) (246 KB)

    IIT and other theories of consciousness.
