Why Most Things Fail: Evolution, Extinction and Economics

Paul Ormerod
Faber & Faber: 2005. 272 pp. £12.99. ISBN: 0-571-22012-6

In An Enquiry Concerning Political Justice (1793), the social philosopher William Godwin argued that it should be possible to extend human life indefinitely through “the sway of mind over matter”. His recipe for immortality included the cultivation of benevolent and optimistic attitudes. The alchemists, too, sought the secret of longevity. Frustrated by such failures, but refusing to be thwarted by their own finitude, humans looked for other ways of imprinting themselves on the future. The pharaohs built pyramids, and the Mongol tyrant Genghis Khan conquered vast tracts of Asia and Europe, reputedly siring so many offspring in the process that as many as 1 in 200 men alive today may be descended from him.

But if flesh could not be immortalized, why not imitate it, creating artificial beings capable of indefinite existence? In Leviathan (1651), the philosopher Thomas Hobbes suggested that the nation state is a type of ‘artificial man’. Ancient Greece and Rome had already spawned their own type of artificial being: the corporate entity. Greek etairia corresponded closely to modern corporations, and Roman collegia enabled property to be held in common. Medieval European business enterprises, such as the early Italian banking firms, were the forerunners of modern multinational companies.

The pivotal event in the evolution of modern corporations came in 1811, when New York state passed legislation enshrining the principle of limited shareholder liability; until then, investors holding even a single share in a company were liable for unlimited losses. But with the adoption of this legislation and the resulting injection of low-risk capital, new companies flourished and New York City became the world's premier financial centre. Other nations followed New York's lead, giving rise to modern stock markets and the global economy.

Boom and bust: the New York Stock Exchange has witnessed the demise of countless companies. Credit: ACE STOCK/ALAMY

In his interesting and entertaining book Why Most Things Fail, Paul Ormerod explains why this experiment with artificial immortality was fatally flawed. Like natural species, companies walk a fine line between existence and extinction. The first biological species on Earth emerged some 3.45 billion years ago. Organisms remained pretty simple until complex multicellular life erupted 550 million years ago during the Cambrian explosion, when the modern animal phyla appeared, along with alternative body plans that left no descendants. Ormerod suggests that this burst of biological creativity was mirrored in the ‘Edwardian explosion’ of 1880–1910, which witnessed the emergence of the first truly multinational corporate entities. By the start of the twentieth century, for example, US Steel employed more than 20,000 people, and by 1917 it had assets in excess of $2.4 billion (around $400 billion in today's terms). But like most of the Cambrian phyla, many of the new corporations became extinct. Neil Fligstein noted in The Transformation of Corporate Control (Harvard University Press, 1990) that only 33 of the top 100 US companies of 1912 were still on the list in 1979. Artificial corporate organisms, like their flesh-endowed counterparts, are both fallible and mortal. Indeed, each year more than 10% of all US companies disappear.
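
To put that last figure in perspective (a back-of-the-envelope illustration only, assuming a constant and independent 10% annual failure rate rather than any calculation of Ormerod's), the implied probability of a company surviving ten years is

    0.9^{10} \approx 0.35,

so roughly two-thirds of a typical cohort of US companies would be expected to vanish within a decade.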

So why do most companies fail — and can management consultants, economists and business gurus do anything to reverse this apparently inexorable trend? Ormerod examines a host of complex systems, including societies, corporations, species, ecologies and government social policies. Can failure in such diverse systems be explained by a general theory? This is an unnerving suggestion, as the prediction of uncertain futures is far from easy. Consider, for example, the fundamental indeterminacy at the subatomic level described by quantum mechanics. Chaos theory tells us that small changes to the starting conditions can have immense consequences, and the theory of computation suggests that there may be no faster way of determining the behaviour of a non-equilibrium system than watching it unfold.

Ormerod attributes the failure to predict phenomena as diverse as the demise of the blue-chip companies Enron and WorldCom, the failure of Coca-Cola's ‘New Coke’ in the 1980s, and the collapse of the Soviet Union to an outdated analytical methodology. Traditional ‘general equilibrium theory’ economics envisages a platonic ideal in which companies are systems in equilibrium with perfect access to information and an unlimited ability to analyse it. Agents act rationally, landscapes are static, and the state of the system can be computed using differential calculus. Recent modifications incorporating game theory or bounded rationality are no different.

These methods have largely failed. Instead, it has been found that the five great periods of biological extinction, the roll-call of failed multinational companies, and phenomena as diverse as stock-market crashes and the structure of links on the World Wide Web are all described by a ‘power law’, in which the frequency of an event falls off roughly as the inverse square of its size. Clearly, deeper forces are at work. These forces emerge not from outside perturbations, but from the intrinsic dynamics of highly interconnected networks that are far from equilibrium. The fascinating generic behaviour of such networks, and the way complex behaviour emerges from the iteration of simple rules, has been beautifully described by Stuart Kauffman and Stephen Wolfram. So it is a shame that Ormerod does not discuss how the invisible hand of emergent network behaviour makes its presence felt.
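
To make the scaling concrete (a general illustration of the mathematical form, not a formula quoted from the book), a power law with exponent 2 relates the frequency f of an event to its size s as

    f(s) \propto s^{-2}, \qquad \text{equivalently} \qquad \log f(s) = c - 2\log s,

so doubling the size of an event makes it roughly four times rarer, and on a log-log plot the data fall on a straight line of slope -2, which is how such laws are usually identified in records of extinctions, firm failures and market crashes.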

There are some gems nevertheless, and the scale and breadth of Ormerod's analysis deserve commendation. Most interesting is the way in which power laws challenge conventional notions of causality. The stock-market crash of October 1987, for example, in which the Dow Jones index collapsed by more than 20% in a single day, may not ultimately have had a distinct cause: in such systems, catastrophic events can follow from insignificant triggers. More important, within Ormerod's framework successful institutions evolve organically, suggesting that excessive government intervention may be both unnecessary and counterproductive.