Globally networked risks and how to respond

Today’s strongly connected, global networks have produced highly interdependent systems that we do not understand and cannot control well. These systems are vulnerable to failure at all scales, posing serious threats to society, even when external shocks are absent. As the complexity and interaction strengths in our networked world increase, man-made systems can become unstable, creating uncontrollable situations even when decision-makers are well-skilled, have all data and technology at their disposal, and do their best. To make these systems manageable, a fundamental redesign is needed. A ‘Global Systems Science’ might create the required knowledge and paradigm shift in thinking.

At a glance


  1. Figure 1: Risks Interconnection Map 2011 illustrating systemic interdependencies in the hyper-connected world we are living in.

    Reprinted from ref. 82 with permission of the WEF.

  2. Figure 2: Spreading and erosion of cooperation in a prisoner’s dilemma game.

    The computer simulations assume the payoff parameters T = 7, R = 6, P = 2 and S = 1, and include success-driven migration (ref. 32). Although cooperation would be profitable to everyone, non-cooperators can achieve a higher payoff than cooperators, which may destabilize cooperation. The graph shows the fraction of cooperative agents, averaged over 100 simulations, as a function of the connection density (the actual number of network links divided by the maximum number of links, when all nodes are connected to all others). Initially, an increasing link density enhances cooperation, but as it passes a certain threshold, cooperation erodes. (A related movie is available online.) The computer simulations are based on a circular network with 100 nodes, each connected with the four nearest neighbours; n links are added randomly. 50 nodes are occupied by agents. The inset shows a ‘snapshot’ of the system: blue circles represent cooperation, red circles non-cooperative behaviour, and black dots empty sites. Initially, all agents are non-cooperative. Their network locations and behaviours (cooperation or defection) are updated in a random sequential way in four steps: (1) the agent plays two-person prisoner’s dilemma games with its direct neighbours in the network; (2) after the interaction, the agent moves with probability 0.5 up to four steps along existing links to the empty node that gives the highest payoff in a fictitious play step, assuming that no one changes behaviour; (3) the agent imitates the behaviour of the neighbour who obtained the highest payoff in step 1 (if higher than the agent’s own payoff); (4) the behaviour is spontaneously changed with a mutation rate of 0.1.

  3. Figure 3: Illustration of probabilistic cascade effects in systems with networked risks.

    The orange and blue paths show that the same cause can have different effects, depending on the respective random realization. The blue and red paths show that different causes can have the same effect. Understanding cascade effects requires knowledge of at least three contributing factors: the interactions in the system, the context (such as institutional or boundary conditions) and, in many cases but not necessarily, a triggering event (that is, randomness may determine the temporal evolution of the system). While the exact timing of the triggering event is often not predictable, the post-trigger dynamics might be foreseeable to a certain extent (in a probabilistic sense). When system components behave randomly, a cascade effect might start anywhere, but it is more likely to originate at a weak part of the system (for example, traffic jams mostly, but not always, start at known bottlenecks).

  4. Figure 4: Cascade spreading is increasingly hard to recover from as failure progresses.

    The simulation model mimics spatial epidemic spreading with air traffic and healing costs in a two-dimensional 50 × 50 grid with periodic boundary conditions and random shortcut links. The colourful inset depicts an early snapshot of the simulation with N = 2,500 nodes. Red nodes are infected, green nodes are healthy; shortcut links are shown in blue. The connectivity-dependent graph shows the mean value and standard deviation of the fraction i(t)/N of infected nodes over 50 simulation runs. Most nodes have four direct neighbours, but a few of them possess an additional directed random connection to a distant node. The spontaneous infection rate is s = 0.001 per time step; the infection rate by an infected neighbouring node is P = 0.08. Newly infected nodes may infect others or may recover from the next time step onwards. Recovery occurs at a rate q = 0.4, if there is enough budget b > c to bear the healing costs c = 80. The budget needed for recovery is generated by the healthy nodes h(t). Hence, if r(t) nodes are recovering at time t, the budget changes according to b(t+1) = b(t) + h(t) − c·r(t). As soon as the budget is used up, the infection spreads explosively. (A related movie is available online.)

  5. Figure 5 (Box 3 figure): Illustration of the principle of a ‘time bomb’.

    A single, local perturbation of a node may cause large-scale damage through a cascade effect, similar to chain reactions in nuclear fission.
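The four update steps in the Figure 2 caption can be sketched in Python. This is a minimal, illustrative reconstruction, not the authors' code: migration is simplified to a single step instead of up to four, and all function and parameter names below are ours.

```python
import random

# Payoff parameters from the Figure 2 caption (prisoner's dilemma).
T, R, P, S = 7, 6, 2, 1

def payoff(me, other):
    """Payoff of `me` against `other` (True = cooperate, False = defect)."""
    if me and other:
        return R            # mutual cooperation
    if me and not other:
        return S            # sucker's payoff
    if not me and other:
        return T            # temptation to defect
    return P                # mutual defection

def simulate(n_nodes=100, n_agents=50, n_extra_links=50,
             steps=200, mutation=0.1, seed=0):
    """Run the four update steps on a ring with random shortcuts;
    return the final fraction of cooperators."""
    rng = random.Random(seed)
    # Ring network: each node linked to its four nearest neighbours.
    neigh = {i: {(i - 2) % n_nodes, (i - 1) % n_nodes,
                 (i + 1) % n_nodes, (i + 2) % n_nodes}
             for i in range(n_nodes)}
    for _ in range(n_extra_links):          # add random shortcut links
        a, b = rng.sample(range(n_nodes), 2)
        neigh[a].add(b)
        neigh[b].add(a)
    # Half the nodes carry agents; all start out as defectors (False).
    occupied = dict.fromkeys(rng.sample(range(n_nodes), n_agents), False)

    def score(node, strat):
        """(1) Total payoff from playing all occupied neighbours."""
        return sum(payoff(strat, occupied[j])
                   for j in neigh[node] if j in occupied)

    for _ in range(steps):
        for node in rng.sample(list(occupied), len(occupied)):
            # (2) Success-driven migration (simplified to one step):
            # with probability 0.5, move to the best reachable empty node.
            if rng.random() < 0.5:
                empty = [j for j in neigh[node] if j not in occupied]
                if empty:
                    strat = occupied.pop(node)   # temporarily vacate
                    node = max([node] + empty,
                               key=lambda j: score(j, strat))
                    occupied[node] = strat
            # (3) Imitate the most successful neighbour, if better off.
            nbrs = [j for j in neigh[node] if j in occupied]
            if nbrs:
                best = max(nbrs, key=lambda j: score(j, occupied[j]))
                if score(best, occupied[best]) > score(node, occupied[node]):
                    occupied[node] = occupied[best]
            # (4) Spontaneous strategy mutation.
            if rng.random() < mutation:
                occupied[node] = rng.random() < 0.5
    return sum(occupied.values()) / len(occupied)
```

Sweeping `n_extra_links` and recording the returned cooperation fraction reproduces the kind of connection-density scan shown in the figure, although quantitative agreement is not to be expected from this simplified sketch.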
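The distinction drawn in the Figure 3 caption, the same cause producing different effects and different causes producing the same effect, can be illustrated with a toy probabilistic cascade. The network and parameters below are hypothetical, chosen only to make the point:

```python
import random

def cascade(adj, trigger, p_fail=0.5, rng=None):
    """Fail `trigger`, then let every failed node topple each healthy
    neighbour independently with probability `p_fail`.
    Returns the set of failed nodes."""
    rng = rng or random.Random()
    failed = {trigger}
    frontier = [trigger]
    while frontier:
        node = frontier.pop()
        for j in adj[node]:
            if j not in failed and rng.random() < p_fail:
                failed.add(j)
                frontier.append(j)
    return failed

# A small hypothetical six-node network, purely for illustration.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}

# Same cause, different effects: identical triggers yield different
# cascade sizes, depending on the random realization.
sizes = {len(cascade(adj, 0, rng=random.Random(s))) for s in range(20)}
```

Conversely, a system-wide failure (all six nodes) can be reached from different trigger nodes under suitable random realizations: different causes, same effect.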
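The budget-constrained recovery dynamics in the Figure 4 caption can be sketched as follows. This is a simplified reconstruction under stated assumptions (a smaller grid, no shortcut links, per-node budget accounting), not the simulation code behind the figure:

```python
import random

def epidemic(L=20, steps=200, s=0.001, p_inf=0.08, q=0.4,
             cost=80.0, budget=0.0, seed=0):
    """Spatial epidemic on an L x L grid with periodic boundaries.
    Infected nodes recover at rate q only while the shared budget,
    replenished by healthy nodes, can pay the healing cost.
    Returns the fraction of infected nodes at each time step."""
    rng = random.Random(seed)
    n = L * L
    infected = [False] * n
    history = []

    def neighbours(i):
        x, y = i % L, i // L
        return (((x + 1) % L) + y * L, ((x - 1) % L) + y * L,
                x + ((y + 1) % L) * L, x + ((y - 1) % L) * L)

    for _ in range(steps):
        nxt = infected[:]
        for i in range(n):
            if not infected[i]:
                # Contagion from an infected neighbour, or a
                # spontaneous infection.
                contagion = any(infected[j] and rng.random() < p_inf
                                for j in neighbours(i))
                if contagion or rng.random() < s:
                    nxt[i] = True
            elif budget > cost and rng.random() < q:
                nxt[i] = False       # recovery is affordable
                budget -= cost       # pay the healing cost
        budget += infected.count(False)  # healthy nodes replenish budget
        infected = nxt
        history.append(sum(infected) / n)
    return history
```

Plotting the returned trajectory for different parameters shows the caption's qualitative behaviour: recovery keeps the infection in check only while the budget generated by healthy nodes covers the healing costs.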


  1. World Economic Forum. Global Risks 2012 and 2013 (WEF, 2012 and 2013)
  2. Rinaldi, S. M., Peerenboom, J. P. & Kelly, T. K. Critical infrastructure interdependencies. IEEE Control Syst. 21, 11–25 (2001)
  3. Rosato, V. et al. Modelling interdependent infrastructures using interacting dynamical models. Int. J. Critical Infrastruct. 4, 63–79 (2008)
  4. Buldyrev, S. V., Parshani, R., Paul, G., Stanley, H. E. & Havlin, S. Catastrophic cascade of failures in interdependent networks. Nature 464, 1025–1028 (2010)
  5. Gao, J., Buldyrev, S. V., Havlin, S. & Stanley, H. E. Robustness of networks of networks. Phys. Rev. Lett. 107, 195701 (2011)
  6. Vespignani, A. The fragility of interdependency. Nature 464, 984–985 (2010)
  7. Brockmann, D., Hufnagel, L. & Geisel, T. The scaling laws of human travel. Nature 439, 462–465 (2006)
  8. Vespignani, A. Predicting the behavior of techno-social systems. Science 325, 425–428 (2009)
  9. Epstein, J. M. Modelling to contain pandemics. Nature 460, 687 (2009)
  10. Crutzen, P. & Stoermer, E. The anthropocene. Global Change Newsl. 41, 17–18 (2000)
  11. Helbing D., Carbone A., eds. Participatory science and computing for our complex world. Eur. Phys. J. Spec. Top. 214 (special issue), 1–666 (2012)
  12. Zeeman E. C., ed. Catastrophe Theory (Addison-Wesley, 1977)
  13. Stanley, H. E. Introduction to Phase Transitions and Critical Phenomena (Oxford Univ. Press, 1987)
  14. Watts, D. J. A simple model of global cascades on random networks. Proc. Natl Acad. Sci. USA 99, 5766–5771 (2002)
  15. Motter, A. E. Cascade control and defense in complex networks. Phys. Rev. Lett. 93, 098701 (2004)
  16. Simonsen, I., Buzna, L., Peters, K., Bornholdt, S. & Helbing, D. Transient dynamics increasing network vulnerability to cascading failures. Phys. Rev. Lett. 100, 218701 (2008)
  17. Little, R. G. Controlling cascading failure: understanding the vulnerabilities of interconnected infrastructures. J. Urban Technol. 9, 109–123 (2002)
    This is an excellent analysis of the role of interconnectivity in catastrophic failures.
  18. Buzna, L., Peters, K., Ammoser, H., Kühnert, C. & Helbing, D. Efficient response to cascading disaster spreading. Phys. Rev. E 75, 056107 (2007)
  19. Lorenz, J., Battiston, S. & Schweitzer, F. Systemic risk in a unifying framework for cascading processes on networks. Eur. Phys. J. B 71, 441–460 (2009)
    This paper gives a good overview of different classes of cascade effects with a unifying theoretical framework.
  20. Battiston, S., Delli Gatti, D., Gallegati, M., Greenwald, B. & Stiglitz, J. E. Default cascades: when does risk diversification increase stability? J. Financ. Stab. 8, 138–149 (2012)
  21. Albeverio S., Jentsch V., Kantz H., eds. Extreme Events in Nature and Society (Springer, 2010)
  22. Bak, P., Tang, C. & Wiesenfeld, K. Self-organized criticality: an explanation of the 1/f noise. Phys. Rev. Lett. 59, 381–384 (1987)
  23. Albert, R., Jeong, H. & Barabási, A. L. Error and attack tolerance of complex networks. Nature 406, 378–382 (2000)
  24. Kun, F., Carmona, H. A., Andrade, J. S. Jr & Herrmann, H. J. Universality behind Basquin’s law of fatigue. Phys. Rev. Lett. 100, 094301 (2008)
  25. Achlioptas, D., D’Souza, R. M. & Spencer, J. Explosive percolation in random networks. Science 323, 1453–1455 (2009)
  26. Sornette, D. & Ouillon, G. Dragon-kings: mechanisms, statistical methods and empirical evidence. Eur. Phys. J. Spec. Top. 205, 1–26 (2012)
  27. Nicolis, G. Introduction to Nonlinear Science (Cambridge Univ. Press, 1995)
  28. Strogatz, S. H. Nonlinear Dynamics and Chaos (Perseus, 1994)
  29. Liu, Y. Y., Slotine, J. J. & Barabási, A. L. Controllability of complex networks. Nature 473, 167–173 (2011)
  30. Dörner, D. The Logic of Failure (Metropolitan, 1996)
    This book is a good demonstration that we tend to make wrong decisions when trying to manage complex systems.
  31. Nowak, M. A. Evolutionary Dynamics (Belknap, 2006)
  32. Helbing, D. Social Self-Organization (Springer, 2012)
    This book offers an integrative approach to agent-based modelling of emergent social phenomena, systemic risks in social and economic systems, and how to manage complexity.
  33. Johansson, A., Helbing, D., Al-Abideen, H. Z. & Al-Bosta, S. From crowd dynamics to crowd safety: a video-based analysis. Adv. Complex Syst. 11, 497–527 (2008)
  34. Helbing, D. & Mukerji, P. Crowd disasters as systemic failures: analysis of the Love Parade disaster. Eur. Phys. J. Data Sci. 1, 7 (2012)
  35. Bettencourt, L. M. A. et al. Growth, innovation, scaling and the pace of life in cities. Proc. Natl Acad. Sci. USA 104, 7301–7306 (2007)
  36. Ball, P. Why Society is a Complex Matter (Springer, 2012)
  37. Aven T., Vinnem J. E., eds. Risk, Reliability and Societal Safety Vols 1–3 (Taylor and Francis, 2007)
    This compendium is a comprehensive source of information about risk, reliability, safety and resilience.
  38. Rodriguez H., Quarantelli E. L., Dynes R. R., eds. Handbook of Disaster Research (Springer, 2007)
  39. Cox, L. A. Jr Risk Analysis of Complex and Uncertain Systems (Springer, 2009)
  40. Perrow, C. Normal Accidents. Living with High-Risk Technologies (Princeton Univ. Press, 1999)
    This eye-opening book shows how catastrophes result from couplings and complexity.
  41. Peters, G. A. & Peters, B. J. Human Error. Causes and Control (Taylor and Francis, 2006)
    This book is a good summary of why, how and when people make mistakes.
  42. Clarke, L. Worst Cases (Univ. Chicago, 2006)
  43. Axelrod, R. & Cohen, M. D. Harnessing Complexity (Basic Books, 2000)
    This book offers a good introduction into complex social systems and bottom-up management.
  44. Tumer, K. & Wolpert, D. H. Collectives and the Design of Complex Systems (Springer, 2004)
  45. Lämmer, S. & Helbing, D. Self-control of traffic lights and vehicle flows in urban road networks. J. Stat. Mech. P04019 (2008)
  46. Perkins, C. E. & Royer, E. M. Ad-hoc on-demand distance vector routing. In Second IEEE Workshop on Mobile Computing Systems and Applications 90–100 (WMCSA Proceedings, 1999)
  47. Amin, M. M. & Wollenberg, B. F. Toward a smart grid: power delivery for the 21st century. IEEE Power Energy Mag. 3, 34–41 (2005)
  48. Schneider, C. M., Moreira, A. A., Andrade, J. S. Jr, Havlin, S. & Herrmann, H. J. Mitigation of malicious attacks on networks. Proc. Natl Acad. Sci. USA 108, 3838–3841 (2011)
  49. Comfort L. K., Boin A., Demchak C. C., eds. Designing Resilience. Preparing for Extreme Events (Univ. Pittsburgh, 2010)
  50. Scheffer, M. et al. Early-warning signals for critical transitions. Nature 461, 53–59 (2009)
  51. Pikovsky, A., Rosenblum, M. & Kurths, J. Synchronization (Cambridge Univ. Press, 2003)
  52. Haldane, A. G. & May, R. M. Systemic risk in banking ecosystems. Nature 469, 351–355 (2011)
  53. Battiston, S., Puliga, M., Kaushik, R., Tasca, P. & Caldarelli, G. DebtRank: too connected to fail? Financial networks, the FED and systemic risks. Sci. Rep. 2, 541 (2012)
  54. Stiglitz, J. E. Freefall: America, Free Markets, and the Sinking of the World Economy (Norton & Company, 2010)
  55. Sterman, J. Business Dynamics: Systems Thinking and Modeling for a Complex World (McGraw-Hill/Irwin, 2000)
  56. Helbing, D. & Lämmer, S. in Networks of Interacting Machines: Production Organization in Complex Industrial Systems and Biological Cells (eds Armbruster, D., Mikhailov, A. S. & Kaneko, K.) 33–66 (World Scientific, 2005)
  57. Young, H. P. Innovation diffusion in heterogeneous populations: contagion, social influence, and social learning. Am. Econ. Rev. 99, 1899–1924 (2009)
  58. Montanari, A. & Saberi, A. The spread of innovations in social networks. Proc. Natl Acad. Sci. USA 107, 20196–20201 (2010)
  59. Grund, T., Waloszek, C. & Helbing, D. How natural selection can create both self- and other-regarding preferences, and networked minds. Sci. Rep. 3, 1480 (2013)
  60. Lazer, D. et al. Computational social science. Science 323, 721–723 (2009)
  61. Epstein, J. M. & Axtell, R. L. Growing Artificial Societies: Social Science from the Bottom Up (Brookings Institution, 1996)
    This is a groundbreaking book on agent-based modelling.
  62. Gilbert, N. & Bankes, S. Platforms and methods for agent-based modeling. Proc. Natl Acad. Sci. USA 99 (suppl. 3), 7197–7198 (2002)
  63. Farmer, J. D. & Foley, D. The economy needs agent-based modeling. Nature 460, 685–686 (2009)
  64. Szell, M., Sinatra, R., Petri, G., Thurner, S. & Latora, V. Understanding mobility in a social petri dish. Sci. Rep. 2, 457 (2012)
  65. de Freitas, S. Game for change. Nature 470, 330–331 (2011)
  66. McNeil, A. J., Frey, R. & Embrechts, P. Quantitative Risk Management (Princeton Univ. Press, 2005)
  67. Preis, T., Kenett, D. Y., Stanley, H. E., Helbing, D. & Ben-Jacob, E. Quantifying the behaviour of stock correlations under market stress. Sci. Rep. 2, 752 (2012)
  68. Floreano, D. & Mattiussi, C. Bio-Inspired Artificial Intelligence (MIT Press, 2008)
  69. Pentland, A. Society’s nervous system: building effective government, energy, and public health systems. IEEE Computer 45, 31–38 (2012)
  70. Kesting, A., Treiber, M., Schönhof, M. & Helbing, D. Adaptive cruise control design for active congestion avoidance. Transp. Res. C 16, 668–683 (2008)
  71. Fowler, J. H. & Christakis, N. A. Dynamic spread of happiness in a large social network. Br. Med. J. 337, a2338 (2008)
  72. Helbing, D. & Johansson, A. Cooperation, norms, and revolutions: a unified game-theoretical approach. PLoS ONE 5, e12530 (2010)
  73. Seydel, R. U. Practical Bifurcation and Stability Analysis (Springer, 2009)
  74. Bak, P., Christensen, K., Danon, L. & Scanlon, T. Unified scaling law for earthquakes. Phys. Rev. Lett. 88, 178501 (2002)
  75. Helbing, D. Traffic and related self-driven many-particle systems. Rev. Mod. Phys. 73, 1067–1141 (2001)
  76. Lozano, S., Buzna, L. & Diaz-Guilera, A. Role of network topology in the synchronization of power systems. Eur. Phys. J. B 85, 231–238 (2012)
  77. Schuster, H. G. & Just, W. Deterministic Chaos (Wiley-VCH, 2005)
  78. Wiener, N. Cybernetics (MIT Press, 1965)
  79. Beale, N. et al. Individual versus systemic risk and the regulator’s dilemma. Proc. Natl Acad. Sci. USA 108, 12647–12652 (2011)
  80. Allen, P. M. Evolution, population dynamics, and stability. Proc. Natl Acad. Sci. USA 73, 665–668 (1976)
  81. Tainter, J. The Collapse of Complex Societies (Cambridge Univ. Press, 1988)
  82. The World Economic Forum. Global Risks 2011 6th edn (WEF, 2011)
  83. Huntington, S. P. The clash of civilizations? Foreign Aff. 72, 22–49 (1993)
  84. Cederman, L. E. Endogenizing geopolitical boundaries with agent-based modeling. Proc. Natl Acad. Sci. USA 99 (suppl. 3), 7296–7303 (2002)
  85. Johnson, N. et al. Pattern in escalations in insurgent and terrorist activity. Science 333, 81–84 (2011)
  86. Beck, U. Risk Society (Sage, 1992)
  87. Lin, N. Social Capital (Routledge, 2010)
  88. Kröger, W. & Zio, E. Vulnerable Systems (Springer, 2011)


Author information


  1. ETH Zurich, Clausiusstrasse 50, 8092 Zurich, Switzerland

    • Dirk Helbing
  2. Risk Center, ETH Zurich, Swiss Federal Institute of Technology, Scheuchzerstrasse 7, 8092 Zurich, Switzerland

    • Dirk Helbing

Competing financial interests

The author declares no competing financial interests.

Corresponding author

Correspondence to Dirk Helbing.
