Chaos, fractals, random graphs and power laws inspire a popular view of complexity in which behaviours that are typically unpredictable and fragile 'emerge' from simple interconnections among like components. But applied to the study of highly evolved systems, this attractively simple view has led to widespread confusion. A different, more rewarding take on complexity focuses on organization, protocols and architecture, and includes the 'emergent' as an extreme special case within a much richer dynamical perspective.

Engineers can learn from biology. Biological systems are robust and evolvable in the face of even large changes in environment and system components, yet can be extremely fragile to small perturbations. Such universally robust yet fragile (RYF) complexity is found wherever we look. Take the evolution of microbes into humans (robustness of lineages on long timescales) punctuated by mass extinctions (extreme fragility). Or diabetes and cancer, conditions resulting from faulty biological control mechanisms, normally so robust as to go unnoticed.


But RYF complexity is not confined to biology. The complexity of technology is exploding around us, but in ways that remain largely hidden. Modern institutions and technologies facilitate robustness and accelerate evolution, but also enable major catastrophes, from network crashes to climate change. Such RYF complexity presents a major challenge to engineers, physicians and, increasingly, scientists. Understanding RYF means understanding architecture — the most universal, high-level, persistent elements of organization — and protocols. Protocols define how diverse modules interact, and architecture defines how sets of protocols are organized.

So biologists can learn from engineering. The Internet is an obvious example of how a protocol-based architecture facilitates evolution and robustness. If you are reading this on the Internet, your laptop hardware (display, keyboard and so on) and software (web browser) both obey sets of protocols for exchanging signals and files. Subject to protocol-driven constraints, you can access an incredible diversity of hardware and software resources.

But it is the architecture of TCP/IP (the Transmission Control Protocol and Internet Protocol) that is more fundamental. The hourglass protocol 'stack' has a thin, hidden 'waist' of universally shared feedback control (TCP/IP) between the visible upper (application software) and lower (hardware) layers. Roughly, IP controls the routes for packet flows and thus the available bandwidth. Applications split files into packets, and TCP controls their rates and guarantees delivery. This allows 'plug-and-play' between modules that obey shared protocols; any set of applications that 'talks' TCP can run transparently and robustly on any set of hardware that talks IP, accelerating the evolution of TCP/IP-based networks.
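To make the plug-and-play idea concrete, here is a minimal sketch in Python, assuming nothing beyond the standard library (the host, port and message are arbitrary choices for the demo, not anything prescribed by the protocols). The application code speaks only TCP through the socket API; it never sees packets, routes or link hardware, because IP chooses the routes and TCP provides reliable, rate-controlled delivery beneath it.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # hypothetical local endpoint for this sketch

def serve_once(srv: socket.socket) -> None:
    # The server never handles packets or routes, only a byte stream
    # that TCP has already reassembled, ordered and delivery-checked.
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo the bytes back

if __name__ == "__main__":
    # Bind and listen before the client connects, so the demo has no race.
    with socket.create_server((HOST, PORT)) as srv:
        threading.Thread(target=serve_once, args=(srv,), daemon=True).start()
        with socket.create_connection((HOST, PORT)) as client:
            client.sendall(b"hello, hourglass waist")
            print(client.recv(1024))
```

The same client code would run unchanged over Ethernet, Wi-Fi or any other hardware that talks IP; that indifference to what lies beneath the waist is precisely the evolvability the hourglass buys.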

Similarly, microbial genes that talk transcription and translation protocols can move from one microbe to another by horizontal gene transfer, also accelerating evolution in a kind of bacterial internet. But as with the technological Internet, the newly acquired proteins work better when they can use additional shared protocols such as group transfers. Thus selection acting at the protocol level could evolve and preserve shared architecture, essentially evolving evolvability.

All life and advanced technologies rely on protocol-based architectures. The evolvability of microbes and IP-based networks illustrates how dramatic, novel, dynamic changes on all scales of time and space can also be coherent, responsive, functional and adaptive. New genes and pathways, laptops and applications, even whole networks, can plug-and-play, as long as they obey protocols. Biologists can even swap gene sequences over the Internet in a kind of synthetic horizontal gene transfer.

Typical behaviour, fine-tuned by such elaborate layers of control, appears boringly robust despite large internal and external perturbations. As a result, complexity and fragility are largely hidden, often revealed only by catastrophic failures. Because components come and go, control systems that reallocate network resources easily confer robustness to outright failures, whereas violations of protocols by even small random rewiring can be catastrophic. So programmed cell (or component) 'death' is a common strategy to prevent local failures from cascading system-wide.
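A hypothetical sketch of that strategy (the Component class and its fault model are invented here for illustration, not taken from any real system): a supervisor simply drops any component that reports an internal fault, reallocating work to the survivors rather than letting the fault spread through shared state.

```python
import random

class Component:
    """A toy network component that fails fast on internal faults."""
    def __init__(self, name: str) -> None:
        self.name = name

    def step(self) -> bool:
        # Invented fault model: a small chance of internal corruption per
        # tick triggers programmed 'death' rather than a silent error.
        return random.random() >= 0.05

def supervise(pool: list) -> list:
    # Robustness to outright failure: dead components are dropped and
    # their load implicitly reallocated, so no local fault cascades.
    return [c for c in pool if c.step()]

if __name__ == "__main__":
    pool = [Component(f"node-{i}") for i in range(10)]
    for _ in range(20):
        pool = supervise(pool)
    print(f"{len(pool)} of 10 components survived 20 ticks")
```

The point of the pattern is the asymmetry the essay describes: losing whole components is cheap, because the membership protocol handles it, whereas a component that kept running while subtly violating the protocol could poison everything.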

The greatest fragility stemming from a reliance on protocols is that standardized interfaces and building blocks can be easily hijacked. So the very standardization that enables horizontal gene transfer, the web and email also aids viruses and other parasites. Large structured rearrangements can be tolerated, whereas small random or targeted changes that subtly violate protocols can be disastrous.

By contrast, in the popular view of complexity described at the beginning, modelling and analysis are both simplified because tuning, structure and details are minimized, as is environmental uncertainty; and superficial patterns in ensemble averages (not protocols) define modularity. An unfortunate clash of cultures arises because architecture-based RYF complexity is utterly bewildering when viewed from this popular perspective. But the search for a deep simplicity and unity remains a common goal.

Fortunately, our growing need for robust, evolvable technological networks means the tools for engineering architectures and protocols are becoming more accessible. These will bring rigour and relevance to the study of complexity generally, but not at the expense of structure and detail. Quite the contrary: both architectures and theories to study them are most successful when they facilitate rather than ignore the inclusion of domain-specific details and expertise.

FURTHER READING

Doyle, J. C. et al. Proc. Natl Acad. Sci. USA 102, 14497–14502 (2005).

Moritz, M. A. et al. Proc. Natl Acad. Sci. USA 102, 17912–17917 (2005).