Noise in biochemical processes can compromise the precision of cellular functions. An information-theoretic analysis reveals a strict limit on how far feedback can suppress such noise.
Simple phenomena provide theoreticians with fertile ground for developing fundamental mathematical formalisms, which are then approximated to provide insight into complex phenomena. Such complexity is a characteristic of cell biochemistry. But it is quite a challenge to simplify a cellular network so as to formulate general conclusions about its operation that are, at the same time, experimentally verifiable.
This challenge is tackled by Lestas et al. on page 174 of this issue1. They provide a theoretical analysis that establishes the limits to which feedback control can suppress noise in a molecular system. Here, noise is the random fluctuation in molecule abundance. Such fluctuations are inevitable when the numbers of a molecule present are low2. For example, many plasmids (self-replicating DNA elements) and messenger RNA molecules have mean copy numbers of around one per cell. The smallest possible increase of copy number, from one to two, corresponds to a dramatic 100% change in concentration. Thus, the feedback control of low concentrations is notoriously difficult, and easily leads to overshooting and random oscillations. Furthermore, slight variations in the feedback control scheme result in markedly different efficiencies of noise suppression; noise can even be amplified3. These ambiguities have hampered the understanding and design of efficient noise control.
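The severity of low-copy-number fluctuations can be seen in the simplest textbook model: a birth–death process with constant production and first-order decay per molecule. This is an illustrative toy model, not the specific system analysed by Lestas et al., and the rates below are arbitrary choices. Its stationary copy-number distribution is Poisson, so a mean of two molecules implies a variance of two and relative fluctuations of roughly 70%:

```python
import random

# Minimal Gillespie simulation of a birth-death process:
# constant birth rate, first-order decay. Rates are illustrative.

def simulate(birth_rate, decay_rate, t_end, seed=1):
    rng = random.Random(seed)
    t, n = 0.0, 0
    times, copies = [], []          # time-weighted record of copy numbers
    while t < t_end:
        total = birth_rate + decay_rate * n
        dt = rng.expovariate(total)
        times.append(min(dt, t_end - t))  # clip the final interval
        copies.append(n)
        t += dt
        if rng.random() < birth_rate / total:
            n += 1                  # birth event
        else:
            n -= 1                  # decay event
    total_time = sum(times)
    mean = sum(w * c for w, c in zip(times, copies)) / total_time
    var = sum(w * (c - mean) ** 2 for w, c in zip(times, copies)) / total_time
    return mean, var

mean, var = simulate(birth_rate=2.0, decay_rate=1.0, t_end=5000.0)
print(mean, var)  # both approach 2.0: the stationary distribution is Poisson
```

With a mean of two, the relative standard deviation is about 1/√2 ≈ 0.7, so individual birth and death events cause swings comparable to the mean itself, which is exactly the regime the theory addresses.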
Lestas et al.1 broke their system down into three parts: the target and mediator molecules, and a feedback circuitry of arbitrary complexity. Although the theory is general, the focus is on target molecules that are present at low copy numbers owing to their slow production and/or fast decay rates. The target enhances the production of a mediator, which in turn triggers signalling in a circuitry that feeds back on the target. An analogous breakdown of a system has been successful in other types of system analysis: well-defined input–output modules are separated from the rest of the signalling circuitry, which has to meet only general conditions. Such an approach has been used to analyse the number of distinct, stable concentration values that a signalling circuitry can induce; this is important for assessing how robustly cellular memory can function4.
The stochastic control problem was then recast by Lestas et al.1 in the language of information theory. Information theory has frequently been applied to find fundamental limits in signal processing, for example, limits on reliable data communication. Within this framework, it becomes evident why the efficiency of noise suppression is limited. If information processing is personified by the gods of Greek mythology, Tyche and Hermes would represent the randomly fluctuating target and mediator molecules, respectively; Athena would devise the ingenious control schemes.
As the wise Athena constrains Tyche, the capricious goddess of chance, Tyche's vigour will wane. In turn, Hermes, the faithful messenger, delivers less and less information about her to Athena, so that the wise goddess will lack essential information with which to adjust the control schemes. Back in the cellular world, this means that, as the feedback circuit starts to reduce copy-number fluctuations of the target molecule, the information available about them becomes limited because these fluctuations generate the very signal that is needed for the feedback control to suppress the fluctuations.
The theory yields surprisingly simple, experimentally verifiable solutions, revealing that the ratio of the mean number of mediator molecules to that of the target molecules is a critical factor in setting the limit of noise suppression. When needed, noise suppression is a costly enterprise: a tenfold reduction in noise is possible only if 10,000 mediator molecules are produced for each target molecule. The cost of reducing variation is large in other realms of biology, as well. For example, the maintenance of constant body temperature in mammals (homeothermy) means that they consume around ten times more energy than do reptiles, which have varying body temperatures. Why spend this extra energy? After all, both types of animal thrive on Earth. The reason is that, because of homeothermy, mammals are not limited to being active only within narrow ranges of environmental temperature, a factor that is thought to have favoured their global expansion.
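The cost quoted above follows a steep power law. Inferring the scaling from the tenfold/10,000 example alone (the authors' exact bound involves more detail than this order-of-magnitude sketch), reducing relative noise by a factor R demands on the order of R⁴ mediator molecules per target molecule:

```python
# Toy illustration of the quartic cost of feedback noise suppression,
# inferred from the tenfold-reduction/10,000-mediators example above.
# This captures only the order-of-magnitude scaling, not the full bound.

def mediators_per_target(noise_reduction):
    """Approximate mediator-to-target copy-number ratio needed to
    reduce relative noise by the given factor, assuming R**4 scaling."""
    return noise_reduction ** 4

print(mediators_per_target(10))   # → 10000
print(mediators_per_target(100))  # → 100000000
```

The quartic exponent is what makes strong noise suppression so expensive: each further halving of the noise multiplies the required mediator production sixteenfold.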
It remains to be seen, however, how widely such expensive negative-feedback loops are used in gene-regulatory networks to suppress fast fluctuations in mediator-molecule abundance. So far, several plasmids have been identified that produce mediator molecules at an amazing rate, which can indeed contribute to efficient control of copy numbers.
A second insight gained by Lestas et al.1 counters the idea that optimal function is attained when a network evolves to a sufficient complexity. On the contrary, increasing the number of components in the signalling circuitry introduces more opportunities for external noise to compromise transmission in the feedback circuitry — which, in our mythical analogy, will leave Athena starved of information. Taking these conclusions to the extreme, it may not be surprising that some mechanisms for plasmid copy-number control lack indirect signalling circuitries altogether. Such control can then be achieved by permitting replication when there is only a single free copy of the plasmid, and by inhibiting replication when two plasmids bind to each other (pairing of the DNA sequences that initiate replication prevents the replication of both plasmids).
Plasmid fluctuations occur on a fast timescale. When fluctuations are slow, the uncertainty in the signal is reduced — Athena can collect more information and exert more precise control. Even without feedback, biochemical reaction networks of appropriate structure can exert absolute concentration control over some of the network components5. When evolutionary gene duplication or slow environmental changes alter gene expression, such networks keep the concentration of the gene product constant. Many biochemical and genetic networks probably combine fluctuations at fast and very slow timescales; these must be disentangled by careful experiments, because networks have very different buffering capacities at the two timescales1,5,6. This may also explain why even a simple negative-feedback loop in a genetic circuit can efficiently reduce concentration heterogeneities in a cell population7.
It will be interesting to compare the optimal network structures suited to suppressing either slowly varying, inherited population heterogeneities or fast fluctuations. Breaking a system down into a few reaction steps to be examined, while constraining the properties of the rest of the signalling network only in a general way, should lead to further insights into the operation of cellular networks. Moreover, knowing the limits of system performance will aid progress in biological engineering.
1. Lestas, I., Vinnicombe, G. & Paulsson, J. Nature 467, 174–178 (2010).
2. Eldar, A. & Elowitz, M. B. Nature 467, 167–173 (2010).
3. Marquez-Lago, T. T. & Stelling, J. Biophys. J. 98, 1742–1750 (2010).
4. Angeli, D., Ferrell, J. E. Jr & Sontag, E. D. Proc. Natl Acad. Sci. USA 101, 1822–1827 (2004).
5. Shinar, G. & Feinberg, M. Science 327, 1389–1391 (2010).
6. Austin, D. W. et al. Nature 439, 608–611 (2006).
7. Nevozhay, D., Adams, R. M., Murphy, K. F., Josić, K. & Balázsi, G. Proc. Natl Acad. Sci. USA 106, 5123–5128 (2009).
See Review, page 167.