If complexity arises from simple rules, should we rethink how to do science?
A New Kind of Science
- Stephen Wolfram
This is a spectacular, iconoclastic book in almost every sense that matters — in the scope of its ideas, in its claims about science, in its sheer physical size, and in many other ways too, both good and less good. Given the book's claims and its author, it is bound to attract widespread attention and scrutiny (see, for example, Nature 417, 216–218; 2002).
So let me state the volume's main thesis right at the outset. The core of Wolfram's argument, justifying the title of the book, is that the computer changes everything. Thus the primary vehicle for the scientific study of nature and humankind should now be “simple” programs and their interactions, rather than mathematical equations. The phenomena we see in the world around us should be thought of as the running of myriad simple computer programs, Wolfram argues. And the best way to understand these processes is by modelling them on a computer, not by working out the implications of an idealized mathematical model stemming from a set of equations. That's it. That is the “new” kind of science.
Before giving an example, it is important to understand what Wolfram means by a “simple program”. In scientific jargon, what he means is a cellular automaton. This is simply a region, such as a line or a plane, that is divided into a large number of 'cells' like the squares on a chequerboard, each of which can be in one of several states, often just two (black or white, say). There is a rule that describes how the state of each cell changes from one moment to the next depending on the states of its neighbours. The rule of change is the 'simple program' — and there is a bewildering variety of them, even for cellular automata with just two states. The number of possible rules explodes with the number of states and the size of the neighbourhood: with k states and an n-cell neighbourhood there are k^(k^n) distinct rules. So once the initial configuration of the cellular automaton and the rule of change are specified, along with a definition of what counts as the 'neighbourhood' of a cell, you simply turn on the computer and let it change the state of each cell in accordance with the rule as time unfolds. The patterns made by the cells as they follow the rule of change are what Wolfram correlates with observed patterns in the real worlds of biology, chemistry, physics, economics, astronomy and all the other activities that concern scientists and humans in general.
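The mechanics just described can be sketched in a few lines of Python. The code below implements a one-dimensional, two-state automaton in which each cell looks at itself and its two immediate neighbours, so a rule is a table with 2^3 = 8 entries and there are 2^8 = 256 possible rules. The choice of Rule 30 — Wolfram's best-known example of complex behaviour from a simple rule — and the wrap-around boundary are this sketch's illustrative choices, not prescriptions from the book:

```python
def step(cells, rule_number):
    """Apply one update of an elementary cellular-automaton rule
    to a row of 0/1 cells, with wrap-around at the edges."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[(i - 1) % n]
        centre = cells[i]
        right = cells[(i + 1) % n]
        # The three-cell neighbourhood, read as a 3-bit number,
        # indexes into the bits of the rule number.
        index = (left << 2) | (centre << 1) | right
        out.append((rule_number >> index) & 1)
    return out

# Start from a single black cell and let Rule 30 unfold.
row = [0] * 15
row[7] = 1
for _ in range(7):
    print("".join("#" if c else "." for c in row))
    row = step(row, 30)
```

Running it prints the familiar triangular, seemingly random pattern that Wolfram reproduces throughout the book.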
To give a flavour of the kind of argument that pervades the entire book, think of the formation of a snowflake. Wolfram says we should start with a grid of hexagonally shaped cells in which one cell is black and the rest are white. Then, using the rule that each cell becomes black when exactly one of the cells adjacent to it was black on the previous step, a pattern emerges after about 30 steps that looks strikingly similar to a real snowflake.
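The snowflake rule above can be sketched as follows, assuming — as in the standard version of such growth rules — that a cell, once black, stays black. The hexagonal grid is represented in axial coordinates (q, r); that coordinate scheme and the function names are illustrative choices of this sketch, not Wolfram's:

```python
# The six neighbour offsets of a hexagonal cell in axial coordinates.
HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def grow(black, steps):
    """Iterate the snowflake rule: a white cell becomes black when
    exactly one of its six neighbours is black; black cells persist."""
    black = set(black)
    for _ in range(steps):
        # Count black neighbours of every white cell on the frontier.
        counts = {}
        for (q, r) in black:
            for (dq, dr) in HEX_NEIGHBOURS:
                cell = (q + dq, r + dr)
                if cell not in black:
                    counts[cell] = counts.get(cell, 0) + 1
        black |= {cell for cell, n in counts.items() if n == 1}
    return black

flake = grow({(0, 0)}, 30)
```

After one step the single seed gains all six neighbours; after two, only the six corner cells (each touching exactly one black cell) join, and the characteristic six-fold branching begins.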
Does this mean that this primitive rule of state transition is one of the rules that Nature uses to construct a real snowflake? Maybe. Or then again, maybe not. This points to a serious difficulty with the “new kind of science”: it is easy to show that for any given snowflake pattern there is usually an impossibly large number of very different rules that will all lead to that same pattern. Which one does Nature use? Just one? Some of them? All of them? The only way to single out Nature's rule from the pretenders is to provide additional information. But the book is rather silent on how to do this. Of course, the question of model validation is a standard one in mathematical modelling, and is certainly not confined to the approach Wolfram puts forward here. But his arguments do nothing to advance this particular art.
But this is a quasi-quibble. The fact is that the book ranges over a dazzling array of topics and areas, treating each from the very same 'simple program' point of view. The growth of crystals, fluid flow, animal pigmentation patterns, price movements in finance, cosmological models of the Universe, elementary particles, the relationship between space and time, gravity, visual and auditory perception, randomness, cryptography, consciousness, quantum theory and much, much more all make their appearance — and not just en passant. That's why this phenomenal book is over 1,200 pages long.
Wolfram summarizes these myriad investigations in what he terms the “Principle of Computational Equivalence”. In everyday language, this principle asserts that almost all natural and artificial processes that are not obviously simple correspond to computations that are of equivalent complexity. In short, it doesn't matter how simple or complicated the rules or the initial conditions are for a process: any such process will always correspond to a computation of equivalent difficulty or, as Wolfram puts it, “equivalent sophistication”. In slightly more technical terms, almost every physical and human process that can occur in our Universe corresponds to the unfolding of a program that can be run on a universal Turing machine. Note that this does not rule out other computational architectures, such as quantum computers. But it asserts that any such non-Turing sort of computation does not correspond to any natural process in the Universe we inhabit.
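To make the phrase “a program that can be run on a universal Turing machine” concrete, here is a toy (decidedly non-universal) Turing-machine simulator. The machine, its rule table and all the names here are illustrative inventions of this sketch, not anything taken from the book:

```python
def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a Turing machine. `rules` maps (state, symbol) to
    (symbol_to_write, head_move, next_state); "halt" stops the run.
    "B" denotes the blank symbol on the unbounded tape."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "B")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return [cells[i] for i in sorted(cells) if cells[i] != "B"]

# A toy machine that flips every bit of its input, then halts
# when it reaches the blank past the end of the tape.
flip = {
    ("start", 0): (1, 1, "start"),
    ("start", 1): (0, 1, "start"),
    ("start", "B"): ("B", 0, "halt"),
}
```

Wolfram's principle, in these terms, is the claim that almost any non-trivial natural process corresponds to running some such rule table — and that nearly all of them are equally “sophisticated”.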
The book argues that this principle is a fundamental law of Nature, with implications that go far beyond other laws such as those of thermodynamics. In fact, says Wolfram, it has “vastly richer implications... than essentially any single collection of laws in science”. This is the distilled essence of this fascinating, frustrating and overwhelmingly hubristic book: almost every process known to humankind corresponds to a computation, and every such computation is equivalent in sophistication to every other.
One of the odder aspects of this strange and wondrous book is that, although it ranges far and wide over absolutely every part of the human and natural landscape, it contains not one single reference! Wolfram's explanation for this truly bizarre fact is that he consulted thousands of books, articles and websites in preparing this volume (and who could really doubt that?), so it would be impossible to prepare a truly scholarly bibliography. Rather, the reader should just read the book's 350 pages of notes, and then search on the web for a “vastly more complete picture of available references than could possibly fit in a book of manageable size”. Well, the book is already of unmanageable size, so why not spend another hundred pages and a few months to finish things off?
On a happier note, all of this material is described in entirely non-technical language, completely accessible to almost any reader. So you'll learn a lot about many things by reading this book. In fact, it is almost like reading a small encyclopedia of science, mathematics, computing and modelling. Seen in that light, perhaps 1,200 pages is not too much after all for a tour of almost every nook and cranny between God, the Universe and just about everything else.
A New Kind of Science is a book that simply cannot be ignored. It makes spectacular claims right from its title page onward that the author makes a herculean effort to deliver on. Whether the book's arguments convince you or not, it will force you to reconsider your notions of what constitutes the practice and content of science. Such a book appears only once every few decades.
Casti, J. Science is a computer program. Nature 417, 381–382 (2002). https://doi.org/10.1038/417381a