Information theory

Information theory is the mathematical study of the quantification, storage and communication of information, with applications ranging from data telecommunications to evolution, neuroscience and cosmology. Entropy measures information in bits: the flip of a fair coin (two equally likely outcomes, 1 bit) carries less entropy than the roll of a fair die (six equally likely outcomes, about 2.58 bits).
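The coin-versus-die comparison follows directly from Shannon's entropy formula, H = -Σ p·log₂(p). A minimal sketch in Python (the function name `entropy_bits` is illustrative, not from any particular library):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: two equally likely outcomes
print(entropy_bits([0.5, 0.5]))   # 1.0 bit

# Fair die: six equally likely outcomes, log2(6) ≈ 2.585 bits
print(entropy_bits([1/6] * 6))
```

For a uniform distribution over n outcomes the formula reduces to log₂(n), which is why the die (log₂ 6) carries more information per outcome than the coin (log₂ 2 = 1).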

Latest Research and Reviews

News and Comment

  • News & Views

    Approaches that abandon traditional speech categories offer promise for developing statistical descriptions that encapsulate how speech conveys information. Grandparents would be among the beneficiaries.

    • Michael S. Lewicki
    Nature 466, 821–822