Information theory is the mathematical quantification of information, with applications ranging from data telecommunications to evolution, brain science, and cosmology. Entropy measures information in bits and is lower for the flip of a fair coin (two equally likely outcomes, 1 bit) than for the roll of a fair die (six equally likely outcomes, log2 6 ≈ 2.58 bits).
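The coin-versus-die comparison follows directly from Shannon's entropy formula, H = −Σ p log2 p. A minimal sketch of that calculation (the function name `entropy_bits` is ours, for illustration):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]   # fair coin: two equally likely outcomes
die = [1 / 6] * 6   # fair die: six equally likely outcomes

print(entropy_bits(coin))  # 1.0 bit
print(entropy_bits(die))   # log2(6) ≈ 2.585 bits
```

More outcomes, spread more evenly, mean more uncertainty resolved per observation, and hence more information.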
Understanding how cells discriminate between stimuli is an ongoing challenge. Here, the authors propose a mathematical framework for inferring the mutual information encoded in temporal signaling dynamics and use it to study how information is transmitted over time in response to different stimuli in the NFκB, MAPK and p53 signaling pathways.
Sarkar et al. present a method called sparse estimation of mutual information landscapes (SEMIL) that quantifies information transmission landscapes through cellular biochemical reaction networks across a space of input distributions using single-cell gene expression data. This study suggests that mutual information landscapes can be used as a performance metric for biochemical reaction networks.
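The idea of a mutual information landscape can be illustrated with a toy discrete channel: fixing the network's stimulus-to-response behavior and sweeping over input distributions yields a different mutual information value at each point. This is only a sketch of the underlying concept, not the SEMIL estimator itself (which works from single-cell gene expression data); the channel matrix and function names here are hypothetical:

```python
import math

def mutual_information_bits(joint):
    """Mutual information I(X;Y) in bits from a joint probability table,
    where joint[x][y] = P(X=x, Y=y)."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for x, row in enumerate(joint):
        for y, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[x] * py[y]))
    return mi

def landscape_point(channel, input_dist):
    """MI for one input distribution, given channel[x][y] = P(Y=y | X=x)."""
    joint = [[px * pyx for pyx in row] for px, row in zip(input_dist, channel)]
    return mutual_information_bits(joint)

# Hypothetical two-stimulus channel: each stimulus mostly evokes its own
# response, with 10% crosstalk between the two.
channel = [[0.9, 0.1],
           [0.1, 0.9]]

# Sweeping the input distribution traces out a one-dimensional MI landscape.
for p in (0.1, 0.3, 0.5):
    print(p, landscape_point(channel, [p, 1 - p]))
```

For this symmetric channel the landscape peaks at the uniform input distribution; that peak is the channel capacity, which is one sense in which such landscapes can serve as a performance metric for a biochemical network.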
Approaches that abandon traditional speech categories offer promise for developing statistical descriptions that encapsulate how speech conveys information. Grandparents would be among the beneficiaries.