Simple moral code supports cooperation

The evolution of cooperation is a frequently debated topic. A study assessing scenarios in which people judge each other shows that a simple moral rule suffices to drive the evolution of cooperation.

The evolution of cooperation hinges on the benefits of cooperation being shared among those who cooperate1. In a paper in Nature, Santos et al.2 investigate the evolution of cooperation using computational modelling, and they identify a rule for moral judgements that is especially powerful at driving cooperation.

Cooperation can be defined as a behaviour that is costly to the individual providing help, but which provides a greater overall societal benefit. For example, if Angela has a sandwich that is of greater value to Emmanuel than to her, Angela can increase total societal welfare by giving her sandwich to Emmanuel. This requires sacrifice on her part if she likes sandwiches. Reciprocity offers a way for benefactors to avoid helping uncooperative individuals in such situations. If Angela knows Emmanuel is cooperative because she and Emmanuel have interacted before, her reciprocity is direct. If she has heard from others that Emmanuel is a cooperative person, her reciprocity is indirect — a mechanism of particular relevance to human societies3.

A strategy is a rule that a donor uses to decide whether or not to cooperate, and the evolution of reciprocal strategies that support cooperation depends crucially on the amount of information that individuals process. Santos and colleagues develop a model to assess the evolution of cooperation through indirect reciprocity. The individuals in their model can consider a relatively large amount of information compared with that used in previous studies.

This increased amount of information matters for at least two reasons. First, models of direct reciprocity show that having more information allows for many possible strategies, which can paradoxically reduce cooperation4. Does something similar happen for indirect reciprocity? Second, indirect reciprocity requires individuals to assess and disseminate reliable information about each other. In a real-world context, this mechanism is most plausible if the amount of information being processed is not excessive. These two considerations suggest that the most compelling models of indirect reciprocity should be simple, and should support cooperation even in settings in which many alternative possibilities exist.

In Santos and colleagues’ set-up, social interactions involve three individuals: a donor, a recipient and a bystander. The donor uses a strategy to decide whether or not to cooperate and pay a cost that produces a benefit for the recipient. The bystander witnesses this and, using a rule termed a norm, assigns a reputation to the donor that is communicated to others in the population. In future social interactions, this reputation affects whether the donor receives the benefits of cooperation when taking on the role of a recipient.
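
To make the setup concrete, here is a minimal sketch of one such round in Python. Everything in it — the function names, and the representation of reputations as a dictionary of booleans — is an illustrative assumption of mine, not the authors' implementation.

```python
# A minimal sketch of one donor-recipient-bystander round.
# All names here are illustrative; Santos et al. define their
# model formally, not as code.

def play_round(donor_strategy, bystander_norm, reputations, donor, recipient):
    """One social interaction: the donor acts, the bystander judges."""
    # The donor's strategy maps the information it can see (here, the
    # recipient's current reputation) to an action (True = cooperate).
    action = donor_strategy(reputations[recipient])

    # The bystander applies its norm to what it witnessed and assigns the
    # donor a new reputation, which becomes public knowledge.
    reputations[donor] = bystander_norm(action, reputations[recipient])
    return action
```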

One version of this interaction is known as a first-order system. In this scenario, two strategies exist. The donor can cooperate or not cooperate (defect). The bystander considers the donor’s cooperation or defection when using a norm to assign a good or bad reputation.

Yet even in this simple system, the bystander has four possible norms: always assign a good reputation; always assign a bad reputation; assign a good reputation if the donor cooperates and a bad one if the donor defects; or assign a bad reputation if the donor cooperates and a good one if the donor defects. These norms vary in complexity: the first two are independent of the donor’s action, so their complexity is low, whereas the latter two depend on the donor’s action, so their complexity is relatively high.

This reflects a general pattern. Give a bystander some information, and the level of complexity can vary between the possible norms. Moreover, the complexity of the most-complex norms increases with the information available, and the scope for increasing complexity is striking. In a second-order system, another component is added to the interaction. For example, both the donor and the bystander consider the reputation of the recipient. This allows 4 possible strategies and 16 possible norms. A third-order system could also include the donor’s reputation, yielding 16 possible strategies and 256 possible norms5.

Santos and colleagues’ fourth-order system additionally allows individuals to consider information about the past reputation of either the recipient or the donor. By incorporating the past, a donor’s reputation is not dependent on a single point in time. In this scenario, 256 strategies and a staggering 65,536 norms are possible.
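
These counts follow from elementary combinatorics: a rule that maps n binary inputs to a binary output can be specified in 2^(2^n) ways. Framing an order-k system as one in which a strategy reads k − 1 binary inputs and a norm reads k (the donor's action plus k − 1 reputations) — a framing of mine, though it reproduces the numbers quoted above — gives:

```latex
% A function from n binary inputs to one binary output can be chosen in
% 2^(2^n) ways: each of the 2^n input patterns gets one of two outputs.
\[
  \#\,\text{strategies} = 2^{2^{k-1}}, \qquad \#\,\text{norms} = 2^{2^{k}} .
\]
\[
  k=2:\ 4 \text{ and } 16; \qquad
  k=3:\ 16 \text{ and } 256; \qquad
  k=4:\ 256 \text{ and } 65{,}536 .
\]
```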

With ample scope for complexity in place, Santos and colleagues then examined each norm separately, and allowed the strategies used to evolve (the frequency of use of each strategy could change over time). The strategies that prevail, given a particular norm, affect the amount of cooperation that occurs. One norm, termed stern judging, stands out from the glut of conceivable norms as a relatively low-complexity norm that is highly likely to promote the evolution of cooperation.
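
As a rough illustration of that evolutionary step, the sketch below uses a pairwise-comparison (Fermi) imitation rule, a standard choice in this literature; whether it matches the authors' exact dynamics is an assumption on my part.

```python
import math
import random

def imitation_step(strategies, payoffs, beta=1.0):
    """One social-learning update: a randomly chosen learner copies a
    randomly chosen model with a probability that rises with the payoff
    difference, so successful strategies spread through the population.
    A stand-in for the authors' stochastic dynamics, not a copy of them."""
    learner, model = random.sample(range(len(strategies)), 2)
    gap = payoffs[model] - payoffs[learner]
    # Fermi function: large positive gaps make imitation near-certain.
    if random.random() < 1.0 / (1.0 + math.exp(-beta * gap)):
        strategies[learner] = strategies[model]
```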

The essence of stern judging is to assign a good reputation to a donor who cooperates with a good recipient or who defects with a bad recipient, and assign a bad reputation to a donor who defects with a good recipient or who cooperates with a bad recipient (Fig. 1). This is a simple second-order norm that supports the evolution of simple and highly cooperative strategies, and it does so even when tested in higher-order systems. From the profusion of feasible norms, more-complex norms do not improve the evolution of cooperation, at least up to the fourth-order system studied by the authors. This suggests that a relatively simple norm, with its correspondingly simple requirements in terms of processing and disseminating information, can suffice to drive indirect reciprocity.
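
Written out as code (same illustrative conventions as before), stern judging reduces to a single comparison between the donor's action and the recipient's standing:

```python
def stern_judging(donor_cooperated, recipient_is_good):
    """Assign a good reputation exactly when the donor's action 'matches'
    the recipient: cooperation with a good recipient, or defection
    against a bad one. This is the XNOR of the two inputs."""
    return donor_cooperated == recipient_is_good
```

That the whole norm compresses to one logical equality is another way of seeing why its complexity is low.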

Figure 1 | The stern-judging rule. Santos et al.2 used a computer-modelling approach to investigate how cooperation might evolve. They investigated scenarios in which a donor can give or refuse help to a recipient, depending on the strategy that the donor uses. The donor’s action is judged by a bystander, who uses a rule (termed a norm) to assign the donor a reputation, which the bystander reports to other members of the society. The authors used this system to test 65,536 different norms in terms of each norm’s ability to support the evolution of cooperative strategies. The norm that stood out as being both low in complexity and highly likely to drive the evolution of cooperation is known as stern judging. The figure shows how a bystander uses the stern-judging norm to assess a donor’s action and thereby assign the donor a good or bad reputation.

This finding also raises a question for the future. Given so many conceivable norms, why use stern judging? In Santos and colleagues’ system, strategies evolve, but norms do not. In reality, strategies and norms evolve together6. Both the way people behave (strategies) and the way they evaluate behaviour (norms) change over time, and this process almost certainly involves both genetic and cultural components7. Examining the co-evolution of strategies and norms with culture in the mix would be challenging in a fourth-order system, but it would increase our understanding of whether and when we might expect to observe people using reciprocity norms effectively to support cooperation.

In addition, in Santos and colleagues’ work, every bystander in a given simulated population uses the same norm. However, in many social settings, there can be variation in the level of subtlety with which different people evaluate social situations. This kind of variation, which could result in bystanders using norms of different levels of complexity, may or may not8 result in disagreements between individuals about how to assign reputations. If disagreements occur, how much disagreement can indirect reciprocity tolerate before cooperation breaks down?

Finally, large-scale cooperation occurs in human societies9, and efforts to explain how this evolved have generated controversy, possibly because mutually compatible mechanisms are sometimes treated as strict alternatives. Perhaps the next step needed to address this will be to systematically combine multiple mechanisms4, including indirect reciprocity, and to test whether specific combinations of mechanisms are especially potent at promoting the evolution of cooperation.

Nature 555, 169-170 (2018)

doi: 10.1038/d41586-018-02621-x


References

1. Henrich, J. J. Econ. Behav. Org. 53, 3–35 (2004).
2. Santos, F. P., Santos, F. C. & Pacheco, J. M. Nature 555, 242–245 (2018).
3. Alexander, R. D. The Biology of Moral Systems (Routledge, 1987).
4. van Veelen, M., García, J., Rand, D. G. & Nowak, M. A. Proc. Natl Acad. Sci. USA 109, 9929–9934 (2012).
5. Ohtsuki, H. & Iwasa, Y. J. Theor. Biol. 239, 435–444 (2006).
6. Pacheco, J. M., Santos, F. C. & Chalub, F. A. C. C. PLoS Comput. Biol. 2, e178 (2006).
7. Richerson, P. J. & Boyd, R. Not by Genes Alone: How Culture Transformed Human Evolution (Univ. Chicago Press, 2005).
8. Uchida, S. & Sigmund, K. J. Theor. Biol. 263, 13–19 (2010).
9. Richerson, P. et al. Behav. Brain Sci. 39, e30 (2016).
