How do moral systems evolve? The common view, rooted in game theory (see box, overleaf), is that cooperation and mutual aid require tight partnerships among individuals, or close kinship [1]. But this dogma is now being shaken: on page 573 of this issue, Nowak and Sigmund [2] report a mathematical model showing that cooperation can become established even if recipients have no chance to return the help to their helper. This is because helping improves one's reputation, which in turn makes one more likely to be helped.
Nowak and Sigmund's stance differs radically from classical approaches, because they assume that any two organisms are unlikely to interact twice. This rules out direct reciprocity: you should not expect your assistance to be returned by somebody you helped on a previous encounter, but by somebody else. For such indirect reciprocity to work, individuals must assess others in their group by watching their games from the sidelines and assigning ‘image scores’ to them. If you watch one individual helping another, you allot him or her one point of image score; if you witness the individual refusing to help, you withdraw one point. Then, when you are asked to help, you cooperate if your partner's score is high enough; otherwise, you defect (that is, you don't cooperate). Strategies can be more or less discriminating according to their ‘score cut’: the minimum score a partner must have to receive help. For example, a ‘cooperative’ strategy has a score cut of zero or less, so it helps other individuals even on a first encounter, when everyone's score is still zero.
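The scoring rule can be sketched in a few lines. The function names and the dictionary of scores below are illustrative assumptions, not Nowak and Sigmund's own implementation:

```python
# A minimal sketch of image scoring, assuming integer scores starting
# at zero and a per-strategy 'score cut' threshold.

def observe(scores, donor, helped):
    """Bystanders raise the donor's image score after seeing help,
    and lower it after seeing a refusal."""
    scores[donor] += 1 if helped else -1

def decides_to_help(score_cut, partner_score):
    """A discriminator helps iff the partner's score meets its cut."""
    return partner_score >= score_cut

scores = {"a": 0, "b": 0}
# Player "a" uses a score cut of 0 (a cooperative strategy), so it
# helps "b" on a first encounter, when "b" still scores zero.
helped = decides_to_help(0, scores["b"])
observe(scores, "a", helped)
print(scores)  # {'a': 1, 'b': 0}
```

A strategy with a higher cut would refuse the same partner, and the bystanders would then debit the refuser's own score.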
This yields a new dilemma. When cooperating, your only reward is a rise in score, which comes at the cost of helping. You bet that the benefit of helping will be returned by someone who acknowledges your good reputation; like an insurance scheme, this works as soon as enough clients subscribe to it. On the other hand, if you punish a low scorer by defecting, you blemish your own reputation. The punishment may even be unfair: if the partner is a discriminator who simply had the bad luck to encounter low-score players, you will refuse to help him. (This is in contrast to strategies based on ‘standing’, where a defection provoked by the co-player's misbehaviour inflicts no black mark [3].)
To probe the success of discriminating strategies, Nowak and Sigmund [2] have simulated the effect of natural selection on a population containing various levels of discrimination. These range from blind cooperation (where every co-player receives help, whatever his score) to unconditional defection (where no co-player is ever considered worthy of help). In every generation there is a given, usually low, number of pairwise interactions, and the frequency of a strategy increases in proportion to its pay-off. The simulations show that the winner is often the most discriminating cooperative strategy, the one whose score cut equals zero. In this way, cooperation is established even though repeated encounters almost never occur.
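The selection dynamics can be sketched as a small simulation. The benefit, cost, population size, number of interactions and the particular score cuts below are illustrative assumptions, not the paper's settings:

```python
import random

def one_generation(cuts, pairs, b=1.0, c=0.1, rng=random):
    """Play `pairs` random donor-recipient games and return payoffs.

    A donor helps iff the recipient's image score meets the donor's
    score cut; helping costs the donor c and yields b to the recipient.
    """
    n = len(cuts)
    score = [0] * n          # image scores reset each generation
    payoff = [0.0] * n
    for _ in range(pairs):
        donor, recipient = rng.sample(range(n), 2)
        if score[recipient] >= cuts[donor]:
            payoff[donor] -= c
            payoff[recipient] += b
            score[donor] += 1   # observers credit the helper...
        else:
            score[donor] -= 1   # ...and debit the refuser
    return payoff

def next_generation(cuts, payoff, rng=random):
    """Reproduce strategies in proportion to (shifted) payoff."""
    base = min(payoff)
    weights = [p - base + 1e-9 for p in payoff]
    return rng.choices(cuts, weights=weights, k=len(cuts))

rng = random.Random(1)
# Score cuts: -5 acts as blind cooperation, 0 as strict discrimination,
# and 6 as unconditional defection (scores effectively never reach 6).
cuts = [rng.choice([-5, 0, 6]) for _ in range(100)]
for _ in range(50):
    payoff = one_generation(cuts, pairs=125, rng=rng)
    cuts = next_generation(cuts, payoff, rng=rng)
```

With few interactions per player, almost no pair meets twice, so any cooperation that persists here is sustained by reputation rather than by direct reciprocity.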
The dynamics of the game become more complicated if reproduction is not quite faithful (there may be mutations in the offspring's score cut). Then, the regime of cooperation is occasionally punctuated by bursts of defection, and the underlying mechanism is always the same. Unconditional cooperators spread by drift, and then defectors who can exploit them invade. In most cases, the discriminators return and re-establish cooperation. Surprisingly enough, a simplified model shows that the only way defectors can achieve a lasting triumph is by invading very rarely, because this provides time for stochastic fluctuations to eliminate almost all discriminators. A population must be challenged often if it is to remain immune to defection.
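The unfaithful reproduction can be sketched as an occasional perturbation of the inherited score cut; the mutation rate and step size here are illustrative assumptions:

```python
import random

def mutate(cut, rate=0.01, rng=random):
    """With probability `rate`, shift the offspring's score cut by one.

    Such drift occasionally produces unconditional cooperators (very
    low cuts), which defectors can later invade.
    """
    if rng.random() < rate:
        return cut + rng.choice([-1, 1])
    return cut
```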
Individuals may not be able to keep track of all scores. Discriminators will spread provided that the amount of information they have about co-players exceeds some threshold. In Nowak and Sigmund's model, this amount is externally imposed, but it should reflect individual features such as the players' cognitive abilities, their propensity to move and the cost incurred in monitoring others. Spatial models could assess the effect of such traits [4]. For example, reduced mobility might be detrimental unless it is compensated for by an exchange of information, such as through gossip [5] (the speed of rumours is proverbial).
The discriminator strategy has an antecedent: the ‘observer tit for tat’ (OTFT) strategy, developed in the context of the iterated prisoner's dilemma [6]. OTFT prescribes defection in the first round if the opponent was seen defecting in his last interaction, and thereafter the standard tit for tat. OTFT can hold its own against defectors when the likelihood of repeated interactions decreases. This suggests that a gradual transition from direct to indirect reciprocation may occur as partnerships become looser.
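The OTFT rule can be written down directly. The function below is a sketch of the description given here, not Pollock and Dugatkin's own formulation:

```python
def otft_move(round_no, opponent_seen_defecting_before, opponent_prev_move):
    """'Observer tit for tat': defect in the first round only if the
    opponent was last observed defecting against someone else;
    afterwards, copy the opponent's previous move (tit for tat)."""
    if round_no == 0:
        return "defect" if opponent_seen_defecting_before else "cooperate"
    return opponent_prev_move
```

The first-round branch is the 'observer' part, a one-shot use of reputation; everything after round one is ordinary direct reciprocity.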
Richard Alexander pioneered the concept of indirect reciprocity to deal with moral systems in human societies [7]. But what about other species? The Arabian babbler, Turdoides squamiceps (Fig. 1), a gregarious bird, seems to enjoy helping other babblers [8]. More than that, babblers actually compete for the position of donor; evidently the gain in status compensates for the cost of cooperating. A discriminating choice of partner is also sometimes involved in the associations between plants and bird pollinators [9]. There may not be much morality in birds or plants, yet indirect reciprocity based on image scoring is a tempting explanation for some of their mutualistic inclinations.
1. Axelrod, R. & Hamilton, W. D. Science 211, 1390–1396 (1981).
2. Nowak, M. A. & Sigmund, K. Nature 393, 573–577 (1998).
3. Sugden, R. The Evolution of Rights, Co-operation and Welfare (Blackwell, Oxford, 1986).
4. Ferrière, R. & Michod, R. E. Am. Nat. 147, 692–717 (1996).
5. Enquist, M. & Leimar, O. Anim. Behav. 45, 747–757 (1993).
6. Pollock, G. B. & Dugatkin, L. A. J. Theor. Biol. 159, 25–37 (1992).
7. Alexander, R. D. The Biology of Moral Systems (Aldine de Gruyter, New York, 1987).
8. Zahavi, A. J. Avian Biol. 26, 1–3 (1995).
9. Bull, J. J. & Rice, W. R. J. Theor. Biol. 149, 63–74 (1991).
10. Sigmund, K. Games of Life (Oxford Univ. Press, 1993).