“It is difficult to make predictions, especially about the future,” goes the proverb. A study of the dynamics of chaotic systems in the context of information theory adds a twist to this saying.
The chaos revolution is now more than 50 years old (ref. 1). Long anticipated by mathematicians, beginning with Henri Poincaré in the 1880s, the field finally emerged on the science stage in the early 1960s following the identification of chaotic behaviour in computer simulations of atmospheric and astronomical systems. Since then, many experimental observations of chaotic dynamics (such as in fluids, electric circuits, lasers and insect populations) and parallel theoretical developments have transformed the field into a fully fledged area of research. The central tenet of chaos is that simple deterministic systems — those in which the past uniquely determines the present and the present pegs down the future — can display behaviour that seems random. Writing in Physics Letters A, James and co-workers (ref. 2) present a study that adds a layer of subtlety to this statement: they show that measurements on a deterministic system that evolves in time may contain information that is specific to its past, present and future.
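This tenet can be seen in a minimal sketch. The logistic map used below is a textbook stand-in, not one of the systems studied by James and co-workers: it is completely deterministic, yet two orbits started a hair's breadth apart soon disagree entirely, which is why a single orbit looks random.

```python
def logistic(x):
    # One deterministic step of the fully chaotic logistic map x -> 4x(1-x).
    return 4.0 * x * (1.0 - x)

# Two initial states differing by one part in ten billion.
x, y = 0.2, 0.2 + 1e-10
max_gap = 0.0
for step in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))

# The rule is exact and identical for both orbits, yet the tiny initial
# uncertainty is amplified until the two trajectories bear no resemblance.
print(max_gap)
```

With a per-step stretching factor of about two, the initial gap of 10^-10 grows to order one within a few dozen iterations, so knowing the present state only approximately leaves the distant future effectively unpredictable.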
In the early days of nonlinear science and chaos, several branches of knowledge from within mathematics, physics and computer science helped to create a terminology that has become the lingua franca of a broad community of scientists from the physical, life and social sciences. Geometry and information theory stand out among these contributors. The first interprets the dynamics of a chaotic system as motion in an abstract space of system states, leading to the characterization of beautiful and elegant swirling trajectories called strange attractors. The second, inspired by the work of physicist Ludwig Boltzmann and mathematician Claude Shannon, focuses on calculations based on the probabilities of specific realizations of a chaotic system. A central finding of the latter approach is that information is produced by chaotic systems as they evolve in time. Strong connections between geometry and information theory have helped to unify the field of chaos (see, for example, ref. 3). James and colleagues' study is based on information theory.
In principle, a chaotic system is as predictable as clockwork, although much less regular. In practice, the famous butterfly effect (ref. 4) amplifies exponentially into the future any uncertainty about the initial state of a dynamical system. But what James et al. found is deeper. Working within the larger context of how to infer models from data, they selected several examples of one- and two-dimensional chaotic systems for which a measurement can be made as coarse as possible — taking a value of either zero or one — through a technique called symbolic dynamics (ref. 5). Consequently, records of the dynamics of these systems are represented as strings of zeros and ones. The authors then performed information-based calculations on these strings that allowed them to correlate a present measurement to its past and future. They found that, typically, some of the information measured in the present comes from the past (redundant or predicted) and the rest is newly created. Focusing on the created information, they further found that some of it (ephemeral) does not carry into the future — in other words, it is readily forgotten, with the rest (bound) being remembered and carried into the future. This fine-graining of information sheds new light on how chaotic processes work.
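The coarse-graining step, and a crude one-step version of the past/present split, can be sketched as follows. The logistic map is again an assumed stand-in for the systems in the paper, and the block-entropy estimate below uses only one symbol of history, whereas the authors' measures condition on the full past and future. For this particular map and partition the coarse bits happen to be maximally random, so essentially all of the information is newly created; the systems James et al. analysed exhibit a richer mixture.

```python
from collections import Counter
from math import log2

def logistic(x):
    # Deterministic chaotic map x -> 4x(1-x), used here for illustration.
    return 4.0 * x * (1.0 - x)

# Symbolic dynamics: record only whether the state lies left or right of 1/2.
x, bits = 0.3, []
for _ in range(20000):
    x = logistic(x)
    bits.append(0 if x < 0.5 else 1)

def block_entropy(seq, L):
    # Shannon entropy (in bits) of length-L words in the symbol string.
    words = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(words.values())
    return -sum(c / n * log2(c / n) for c in words.values())

# Entropy rate estimate H(2) - H(1): information created per measurement.
created = block_entropy(bits, 2) - block_entropy(bits, 1)
# The remainder of the single-symbol entropy is predictable from the
# previous symbol (the one-step analogue of 'redundant' information).
predicted = block_entropy(bits, 1) - created
print(round(created, 2), round(predicted, 2))
```

Here `created` comes out close to one bit per measurement and `predicted` close to zero, which is the extreme case in which the coarse measurements behave like fair coin flips.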
To illustrate these concepts, consider a real-life chaotic system, such as an electric circuit, for which coarse measurements can be made but whose true states are inaccessible (Fig. 1). In this example, any sequence with two consecutive zeros cannot occur. Even without performing any calculations, one can glean some manifestations of the types of information identified in James and co-workers' study. A sequence '01' is an example of redundant information, because the zero always implies a one. A measurement of '1' can be preceded by either '0' or '1'; this exemplifies ephemeral information, because what came before it becomes irrelevant and is forgotten. Finally, a measurement of '0' carries bound information, because the system remembers it and evolves to '1'.
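These qualitative observations can be checked with a short simulation. The sketch below assumes the circuit's measurements follow the simplest process that forbids '00' (after a '0' the next symbol must be '1'; after a '1' either symbol is equally likely); the actual circuit of Fig. 1 may differ in its probabilities.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Generate a symbol string in which '00' can never occur.
seq = [1]
for _ in range(10000):
    seq.append(1 if seq[-1] == 0 else random.randint(0, 1))

# The forbidden word never appears in the record.
assert '00' not in ''.join(map(str, seq))

# Redundant information: a '0' always implies that the next symbol is '1'.
after_zero = {b for a, b in zip(seq, seq[1:]) if a == 0}
print(after_zero)  # only 1 ever follows 0

# Ephemeral flavour: after a '1' both continuations occur, so a present '1'
# pins down neither what preceded it nor what follows it.
after_one = {b for a, b in zip(seq, seq[1:]) if a == 1}
print(after_one)  # both 0 and 1 follow 1
```

The set following '0' contains only '1' (zero surprise, fully predicted), while the set following '1' contains both symbols, matching the informal reading of redundant, ephemeral and bound information in the figure example.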
James and colleagues' results show how the past and future of an evolving chaotic system become intertwined with its present. This feature may be at the heart of one of the most enigmatic of physical principles: the second law of thermodynamics, which states that the entropy of an isolated system never decreases with time. The statistical, irreversible character of this law is at odds with the underlying deterministic and reversible dynamics of such isolated systems at the microscopic level (refs 6, 7). The idea of applying the information-based methods presented here to thermodynamic systems, such as collections of gas molecules, is promising. Considering the entropy of such a collection as a property of its state might lead to insight into the 'arrow of time' in the second law, especially because, as James and co-workers show, chaos both forgets and remembers.