The statistical physics that emerged in the nineteenth century, in large part due to the work of Maxwell, Boltzmann and Gibbs, rested on key equilibrium assumptions: the equipartition of energy, or (equivalently) the ergodicity of system dynamics. Take these assumptions away and it's hard to get very far, even though out of equilibrium is where most of the real world resides. Yet physicists have, for a century, made slow but steady progress.

In the 1930s, Lars Onsager showed how flows out of equilibrium produce entropy, in proportion to the flow, at least near equilibrium. Later, Ilya Prigogine speculated about 'minimum entropy production' as an organizing principle for the steady states of dissipative systems driven away from equilibrium. In the 1990s, the idea of self-organized criticality became a major theme, as did growth and the ubiquitous emergence of fractal structures.

Scientists have taken small steps outward into unknown territory, finding some limited areas of firm ground. We now have a few examples of general relations that hold arbitrarily far away from equilibrium. For example, in the late 1990s, Gavin Crooks, building on the work of Christopher Jarzynski and others, discovered a key relation describing the irreversibility of transitions between microscopic states away from equilibrium. There's a link, that is, between how much more likely something is to happen in the forward (rather than the reverse) direction, and the amount of entropy produced in a system's surroundings during a non-equilibrium transition. If Ω is this entropy, then the ratio of forward to reverse probabilities is just e^Ω; forward, entropy-producing processes are exponentially more likely than reverse, entropy-reducing ones.
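Written out schematically (a minimal sketch in my own notation rather than the full statement, with Ω the entropy generated in the surroundings measured in units of Boltzmann's constant), the relation reads

\[
\frac{P_{\mathrm{forward}}}{P_{\mathrm{reverse}}} = e^{\Omega}.
\]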

Something like this was known before, but the new work showed that it holds with much greater generality, across a wide variety of non-equilibrium processes. A number of applications have followed (mostly dealing with improving and analysing simulations of nano- or biomolecular systems) and many more may be just around the corner. One of particular interest has recently come from the physicist Jeremy England at the Massachusetts Institute of Technology in the USA.

A couple of years ago, England began wondering about biological replication and cell division. Look at a bacterial colony in a petri dish and you will see, over any time period, far more examples of cells dividing into two than of two cells merging spontaneously back into one. The former is so much more likely that the latter is effectively never seen. The biology is, of course, consistent with statistical physics — the growth process is out of equilibrium, irreversible, and produces entropy along the way. But is there anything more to say? Does statistical physics have anything to teach us about the constraints under which replication operates?

England suggests that it does, and that Crooks's relation shows the way. As he has shown (J. Chem. Phys. 139, 121923; 2013), the relationship between the probabilities for forward and reverse transitions between microscopic states carries important implications for macroscopic biological realities. In particular, it leads to a lower bound on the amount of heat produced by a self-replicator during one replication, which England is able to derive in explicit form. The bound depends on the replicator's size, growth rate, internal entropy and durability.
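A rough sketch of the kind of inequality involved may help, with the caveat that the notation and simplifications here are mine, not England's exact statement: if g is the replicator's growth rate, δ the rate at which it decays back into its components, Δs_int the change in its internal entropy per copy made, Δq the heat released into the surroundings and β = 1/k_BT, then

\[
\beta\,\Delta q + \Delta s_{\mathrm{int}} \;\ge\; \ln\frac{g}{\delta},
\]

so the minimum heat that must be dissipated per replication rises with the growth rate and falls as the replicator becomes more fragile or internally disordered.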

This has some surprising implications. For example, think about maximizing the growth rate of a replicator, which is constrained by three factors: an organism's internal entropy, its durability and how much heat it dissipates into the environment during replication. One consequence, as England states it, is that “exact considerations in statistical physics tell us that a self-replicator's maximum potential fitness is set by how effectively it exploits sources of energy in its environment. The empirical, biological fact that reproductive fitness is intimately linked to efficient metabolism now has a clear and simple basis in physics.”
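Rearranged under the same schematic assumptions, the inequality caps the growth rate instead:

\[
g \;\le\; \delta\, e^{\beta\,\Delta q + \Delta s_{\mathrm{int}}},
\]

which is one way to read England's statement: for a given durability and internal entropy, replicating faster demands dissipating more heat, so fitness is ultimately limited by how effectively energy from the environment can be harnessed and released.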

It's a profound idea, and England goes on to test it in several scenarios. It seems to stand up. He first considers simple replicating RNA molecules. In a recent study, biologists were able to optimize the growth rate of a self-replicating RNA molecule through in vitro evolution. The physical chemistry of these optimized RNA replicators falls within the bounds set out by England's inequality, but only just. Optimization comes close to the putative upper bound, but does not surpass it, as would be expected.

This example reveals something a little counter-intuitive: the RNA polymer turns out to be much more efficient at reproduction than DNA, mainly because it is much less durable. That sounds odd at first, as we tend to think of durability as a good thing for evolution. It may well be, for preserving genetic information in harsh conditions, but it is no advantage for rapid reproduction considered on its own. Being more fragile, in the sense of decaying more quickly, reduces how much entropy needs to be produced in each replication cycle.

England also shows that the theory applies well to single-celled organisms such as bacteria. Using physical parameters for E. coli, his calculations indicate that the theoretical minimum heat this bacterium could produce in a single replication is roughly one-sixth of what it actually does produce. The bacterium respects the bound, but doesn't operate very close to it. As he points out, however, this isn't unexpected, as the bacterium has to do far more to survive than simply maximize its rate of replication. It lives in a wildly fluctuating environment, and has to support the complex internal machinery that lets it adjust to that environment, so it can't be optimized to any one set of conditions. Even so, the bacterium isn't far from the bound: cutting the heat it produces to a quarter of its actual value would already put it close to the edge.
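A minimal worked check, taking the stated factor of six at face value:

\[
Q_{\mathrm{actual}} \approx 6\,Q_{\mathrm{min}} \quad\Rightarrow\quad \tfrac{1}{4}\,Q_{\mathrm{actual}} \approx 1.5\,Q_{\mathrm{min}},
\]

so even after a four-fold cut in heat output the bacterium would still sit just above the minimum.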

This is a fascinating link between the microscopic physics of irreversible processes and the macroscopic world. It may be going too far afield, but I can't help wondering about potential links to the human world. We also replicate, if not by cell division, and our societies harness and dissipate energy to maintain a low-entropy environment. It's interesting that human societies do exhibit something like a metabolism, with scaling laws for energy use as a function of the size of our economies. As we have grown in technology and sophistication, even as we have become more efficient in our use of energy, we have always ended up using more energy overall, and dissipating more into our environment.

This seems to fit, at least crudely, with England's picture. Is this also a consequence of a general principle of non-equilibrium thermodynamics? Who knows. I suspect the day we find out may not be too far away.