SPECIALTY GRAND CHALLENGE article

Front. Phys., 11 October 2018
Sec. Social Physics
Volume 6 - 2018 | https://doi.org/10.3389/fphy.2018.00107

Grand Challenges in Social Physics: In Pursuit of Moral Behavior

Valerio Capraro1* and Matjaž Perc2,3,4*

  • 1Department of Economics, Middlesex University, The Burroughs, London, United Kingdom
  • 2Faculty of Natural Sciences and Mathematics, University of Maribor, Maribor, Slovenia
  • 3Complexity Science Hub Vienna, Vienna, Austria
  • 4School of Electronic and Information Engineering, Beihang University, Beijing, China

Methods of statistical physics have proven valuable for studying the evolution of cooperation in social dilemma games. However, recent empirical research shows that cooperative behavior in social dilemmas is only one form of a more general class of behavior, namely moral behavior, which includes reciprocity, respecting others' property, honesty, equity, efficiency, as well as many others. Inspired by these experimental works, we here open up the path toward studying other forms of moral behavior with methods of statistical physics. We argue that this is a far-reaching direction for future research that can help us answer fundamental questions about human sociality. Why did our societies evolve as they did? Which moral principles are more likely to emerge? What happens when different moral principles clash? Can we predict the outbreak of moral conflicts in advance and contribute to their solution? These are amongst the most important questions of our time, and methods of statistical physics could lead to new insights and contribute toward finding answers.

1. Introduction

Our time now is unique and special in that we are arguably richer, safer, and healthier than ever before [1, 2], but simultaneously, we are also facing some of the greatest challenges of our evolution. Climate change, the depletion of natural resources, staggering inequality, the spread of misinformation, persistent armed conflicts, just to name a few examples, all require our best efforts to act together and to renounce part of our individual interests for the greater good. Understanding when, why, and how people deviate from their best self-interest to act pro-socially, benefitting other people and the society as a whole, is thus amongst the most important aims of contemporary scientific research.

Pro-social behavior can come in many forms, the most studied of which is cooperation. Indeed, cooperation is so important that many have contended that our capacity to cooperate at large scales with unrelated others is what makes human societies so successful [3–14]. Moreover, the psychological basis of cooperation, shared intentionality, that is, “the ability and motivation to engage with others in collaborative, co-operative activities with joint goals and intentions” is what makes humans uniquely human, as it is possessed by children, but not by great apes [15].

Although human cooperation is believed to originate from our evolutionary struggles for survival [16], it is clear that the challenges that pressured our ancestors into cooperation are today gone. Nevertheless, we are still cooperating, and on ever larger scales, to the point that we may deserve being called “SuperCooperators” [17]. Taking nothing away from the immense importance of cooperation for our evolutionary success and for the wellbeing of our societies, recent empirical research shows, however, that to cooperate is just a particular manifestation of moral behavior [18]. And while methods of statistical physics have been used prolifically to study cooperation [14], other forms of moral behavior have not. Our goal here is to discuss and outline the many possibilities for future research at the interface between physics and moral behavior, beyond the traditional framework of cooperation in social dilemmas.

2. Cooperation

To study cooperative behavior, scientists use social dilemma games, such as the prisoner's dilemma [19], the stag hunt [20], or the public goods game [21]. In these games, players have to decide whether to cooperate or to defect: cooperation maximizes the payoff of the group, while defection maximizes the payoff of an individual. This leads to a conflict between individual and group interests, which is at the heart of each social dilemma, and in particular at the heart of the cooperation problem.
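
To make the tension concrete, the following minimal sketch in Python encodes the canonical prisoner's dilemma ordering T > R > P > S; the numerical payoff values are illustrative choices of ours and are not taken from any of the cited studies.

```python
# A minimal sketch of the canonical prisoner's dilemma payoff structure,
# with illustrative values satisfying T > R > P > S.
T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker's payoff

def payoff(my_move, other_move):
    """Return my payoff; 'C' = cooperate, 'D' = defect."""
    if my_move == 'C':
        return R if other_move == 'C' else S
    return T if other_move == 'C' else P

# Defection is the better individual choice against either move ...
assert payoff('D', 'C') > payoff('C', 'C')  # T > R
assert payoff('D', 'D') > payoff('C', 'D')  # P > S
# ... yet mutual cooperation is better for the group than mutual defection.
assert 2 * payoff('C', 'C') > 2 * payoff('D', 'D')  # 2R > 2P
```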

Since cooperating is not individually optimal, cooperative behavior cannot evolve among self-interested individuals, unless other mechanisms are at play. Several mechanisms for the evolution of cooperation have been identified and studied, including kin selection [22], direct reciprocity [23], indirect reciprocity [24], social preferences [25–28], the internalization of social heuristics [29], translucency [30], cooperative equilibria [12, 31, 32], as well as many others.

One realistic mechanism for the evolution of cooperation is network reciprocity. Everyday interactions among humans do not happen in a vacuum. We are more likely to interact and cooperate within our network of family members, friends, and coworkers, and we rarely interact, let alone cooperate, with strangers. One can formalize this situation by assuming that individuals occupy the vertices of a graph and interact only with their neighbors. Can this spatial structure promote the evolution of cooperation? The answer is yes [33]. And the intuition is that, in this setting, cooperators can form clusters and protect themselves from the invasion of defectors [34–36]. These “games on graphs are difficult to analyze mathematically,” but “are easy to study by computer simulations” [9]. Games on networks present the natural setting in which one can apply the techniques and methods of statistical physics and network science to study cooperation [37, 38], as well as other forms of moral behavior.
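
As an illustration of this setting, the following Python sketch (our own simplified construction, not code from the cited works) places a weak prisoner's dilemma with reward R = 1, punishment and sucker's payoff P = S = 0, and temptation T = b on a square lattice with periodic boundaries, and accumulates each player's payoff over the games played with its four nearest neighbors; the lattice size and the value of b are arbitrary illustrative choices.

```python
import numpy as np

# A minimal sketch of a prisoner's dilemma on an L x L square lattice
# with von Neumann neighborhoods and periodic boundary conditions.
L = 50
b = 1.6                                       # temptation to defect (weak PD: R = 1, P = S = 0, T = b)
rng = np.random.default_rng(0)
strategies = rng.integers(0, 2, size=(L, L))  # 1 = cooperator, 0 = defector

def neighbors(i, j):
    """Four nearest neighbors of site (i, j) with periodic boundaries."""
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

def accumulated_payoff(i, j):
    """Payoff of player (i, j) from playing the weak PD with each neighbor."""
    total = 0.0
    for ni, nj in neighbors(i, j):
        if strategies[i, j] == 1:             # I cooperate
            total += 1.0 if strategies[ni, nj] == 1 else 0.0
        else:                                 # I defect
            total += b if strategies[ni, nj] == 1 else 0.0
    return total
```

Strategy updating, for example by imitation of more successful neighbors, then determines whether clusters of cooperators grow or shrink.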

3. Statistical Physics of Human Cooperation

Methods of statistical physics have come a long way in improving our understanding of the emergence of cooperation and the phase transitions leading to other counterintuitive evolutionary outcomes. Research has revealed that these depend sensitively on the structure of the interaction network and the type of interactions, as well as on the number and type of competing strategies [39–49]. Aspects particularly relevant to human cooperation have also been studied in much detail [14]. The workhorse behind this research has been the spatial public goods game [50, 51], with extensions toward different forms of punishment [52–58], rewarding [59–62], and tolerance [63], to name just some examples. The Monte Carlo method is thereby typically used [64], which ensures that the treatment is aligned with fundamental principles of statistical physics, enabling a comparison of obtained results with generalized mean-field approximations [65–67] and a proper determination of phase transitions between different stable strategy configurations [45]. The goal is to identify and understand pattern formation, the spatiotemporal dynamics of solutions, and the principles of self-organization that may lead to socially favorable evolutionary outcomes.
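
For concreteness, the following Python sketch outlines one elementary step of such a Monte Carlo simulation for the spatial public goods game with the commonly used Fermi imitation rule; the lattice size, the multiplication factor r, the noise level K, and the unit contribution are illustrative assumptions rather than values from any particular study cited above, and a full Monte Carlo step consists of one such elementary step per player on average.

```python
import numpy as np

# A minimal sketch of one elementary Monte Carlo step for the spatial public
# goods game. Groups consist of a player and its four nearest neighbors.
L, r, K = 50, 3.8, 0.5          # lattice size, multiplication factor, noise
rng = np.random.default_rng(1)
s = rng.integers(0, 2, (L, L))  # 1 = cooperator (contributes 1), 0 = defector

def group(i, j):
    """Members of the group centered on site (i, j), periodic boundaries."""
    return [(i, j), ((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

def payoff(i, j):
    """Total payoff of player (i, j) over the five overlapping groups it belongs to."""
    total = 0.0
    for center in group(i, j):
        members = group(*center)
        pool = sum(s[m] for m in members) * r / len(members)
        total += pool - s[i, j]  # equal share of the pool minus own contribution
    return total

def elementary_step():
    """A randomly chosen player x imitates a random neighbor y with the Fermi probability."""
    i, j = rng.integers(L), rng.integers(L)
    ni, nj = group(i, j)[rng.integers(1, 5)]  # one of the four neighbors of x
    p_x, p_y = payoff(i, j), payoff(ni, nj)
    if rng.random() < 1.0 / (1.0 + np.exp((p_x - p_y) / K)):
        s[i, j] = s[ni, nj]
```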

An example of an impressively intricate phase diagram, obtained from studying an 8-strategy public goods game with diverse tolerance levels, is presented in Figure 1 of Szolnoki and Perc [63]. There, it can be observed that the higher the value of the multiplication factor in the public goods game, the higher the tolerance can be, and vice versa. This observation resonates with our naive expectation and perception of tolerance in that overly tolerant strategies cannot survive in the presence of other less tolerant strategies. From the viewpoint of the considered evolutionary game this is not surprising, because players adopting the most tolerant strategy act as loners only if everybody else in the group is a defector. And such sheer unlimited tolerance is simply not competitive with other less tolerant strategies. Also, if the cost of inspection is too high, or if the value of the multiplication factor is either very low or very high, then tolerant players cannot survive even if they exhibit different levels of tolerance.

While it is beyond the scope of this work to go further into details, it should be noted that phase diagrams such as the one presented in Figure 1 of Szolnoki and Perc [63] provide an in-depth understanding of the evolutionary dynamics and of the phase transitions that lead from one stable strategy configuration to another. The key for obtaining accurate locations of phase transition points and the correct phases is the application of the stability analysis of competing subsystem solutions [64]. A subsystem solution can be formed by any subset of all the competing strategies, and on their own (if separated from other strategies) these subsystem solutions are stable. This is trivially true if the subsystem solution is formed by a single strategy, but can be likewise true if more than one strategy forms such a solution. The dominant subsystem solution, and hence the phase that is ultimately depicted in the phase diagram as the stable solution of the whole system, can only be determined by letting all the subsystem solutions compete against each other.
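
A minimal sketch of the general protocol, stated under our own simplifying assumptions rather than as a reproduction of the cited implementations, is the following: each subsystem solution is first relaxed in isolation on its own half of the lattice, the two halves are then joined, and the joined system is evolved with the usual dynamics, so that the motion of the interface reveals which solution invades the other.

```python
import numpy as np

# A minimal sketch of how two independently relaxed subsystem solutions can be
# brought into contact on an L x L lattice to test which one is stable.
L = 100

def joined_initial_state(left_solution, right_solution):
    """Glue two independently relaxed L x (L // 2) configurations side by side."""
    assert left_solution.shape == right_solution.shape == (L, L // 2)
    return np.concatenate([left_solution, right_solution], axis=1)

# Each half is first evolved with strategy updates confined to its own half, so
# that the corresponding subsystem solution relaxes undisturbed; the glued state
# is then evolved with unrestricted dynamics, and whichever solution spreads
# across the interface is the stable one at the given parameter values.
```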

By means of this approach, several important insights have been obtained. For peer-based strategies, for example, we note the importance of indirect territorial competition in peer punishment [68], the spontaneous emergence of cyclic dominance in rewarding [59], and an exotic first-order phase transition observed with correlated strategies [69]. For institutionalized strategies, we have the fascinating spatiotemporal complexity that is due to pool punishment [53], while in the realm of the self-organization of incentives for cooperation, we have the elevated effectiveness of adaptive punishment [54], the possibility of probabilistic sharing to solve the problem of costly punishment [56], and the many evolutionary advantages of adaptive rewarding [61]. With antisocial strategies, we have the restoration of the effectiveness of prosocial punishment when accounting for second-order free-riding on antisocial punishment [58], and the rather surprising lack of adverse effects with antisocial rewarding [62].

While this is just a short snippet of statistical physics research concerning human cooperation, it hopefully showcases the potency of the approach for studying complex mathematical models that describe human behavior, thus recommending it also for the study of other types of moral behavior, to which we attend in what follows.

4. Moral Behavior

Empirical research has indeed shown that cooperation in social dilemmas is only one facet of a more general class of behavior – moral behavior. When subjects are asked to report what they think is the morally right thing to do in social dilemmas, they typically answer: “to cooperate” [18].

Morality is universal across human societies. Virtually all societies adopt behavioral rules that are presented to the people as moral principles. But where do these rules come from? A classical non-scientific explanation, still adopted by many societies and religious thinkers, is that they emanate directly from God. However, in recent years, social scientists have been developing a scientific theory of morality, according to which morality evolved as a mechanism “to promote and sustain cooperation” [70]. As the psychologist Michael Tomasello put it: “human morality arose evolutionarily as a set of skills and motives for cooperating with others” [71]. Similar positions have also been put forward in Rawls [72], Mackie [73], Wong [74], Rai and Fiske [75], Curry [76], and Curry et al. [77]. However, the word “cooperation” in these statements does not refer only to cooperation in social dilemmas. How does this general form of cooperation translate into specific behaviors?

A recent study exploring morality in 60 societies across the world found that seven moral rules are universal: love your family, help your group, return favors, be brave, defer to authority, be fair, and respect others' property. What is not universal, however, is how these rules are ranked [77]. Of course, not all of these rules are easy to study using simple games on networks, but some are. For example, “returning favors” can be studied using a sequential prisoner's dilemma, in which the players do not choose their strategy simultaneously, but sequentially. Alternatively, it can be studied using the trust game [78]. In the trust game, player 1 starts with a sum of money and has to decide how much of it, if any, to transfer to player 2. Any amount transferred gets multiplied by a factor larger than 1 and handed to player 2. Then player 2 has to decide how much of it to keep and how much of it to return to player 1.
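
The following minimal Python sketch encodes the trust game payoffs just described; the multiplication factor of 3 and the example amounts are illustrative assumptions of ours, not parameters from the cited experiments.

```python
# A minimal sketch of the trust game: player 1 transfers part of an endowment,
# the transfer is multiplied, and player 2 decides how much to return.
def trust_game(endowment, transfer, returned, multiplier=3.0):
    """Return the pair (payoff of player 1, payoff of player 2)."""
    assert 0 <= transfer <= endowment
    received = transfer * multiplier      # amount handed to player 2
    assert 0 <= returned <= received
    payoff1 = endowment - transfer + returned
    payoff2 = received - returned
    return payoff1, payoff2

# Example: full trust that is fully reciprocated splits the multiplied amount evenly.
print(trust_game(endowment=10, transfer=10, returned=15))  # (15, 15.0)
```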

Similarly, “help your group” can be studied using games with labeled players, in which agents come with a label representing the group(s) they belong to [79]. “Fairness” can be studied using the ultimatum game [80], as has already been done along these lines [45, 81–90], or the dictator game [91]. “Respect others' property” can be studied using games with special frames, as, for example, the dictator game in the take frame, for which it is known that taking is considered to be more morally wrong than giving [92].
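
As an illustration, ultimatum game payoffs can be written down in a few lines; parameterizing strategies by an offer and an acceptance threshold follows the spatial ultimatum game studies cited above, while the numerical values below are our own illustrative choices.

```python
# A minimal sketch of ultimatum game payoffs with an offer p and an
# acceptance threshold q; rejection leaves both players with nothing.
def ultimatum_payoffs(offer, threshold, pie=1.0):
    """Proposer offers `offer`; the responder accepts iff offer >= threshold."""
    if offer >= threshold:
        return pie - offer, offer  # (proposer payoff, responder payoff)
    return 0.0, 0.0                # the pie is destroyed on rejection

print(ultimatum_payoffs(0.4, 0.3))  # accepted: (0.6, 0.4)
print(ultimatum_payoffs(0.2, 0.3))  # rejected: (0.0, 0.0)
```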

Beyond these seven rules, there are other forms of moral behavior that are worth studying, as, for example, “honesty.” A common game theoretic paradigm to study honest behavior is the sender-receiver game [93]. In this game, player 1 is given a piece of private information (for example, the outcome of a die roll) and is asked to communicate this piece of information to player 2. Player 1 can either communicate the truthful piece of information, or can lie. The role of player 2 is to guess the original piece of information. If player 2 guesses the original piece of information, then players 1 and 2 are both paid according to some option A. Conversely, if player 2 does not guess the original piece of information, then players 1 and 2 are both paid according to option B. Crucially, only player 1 knows the payoffs associated with options A and B. A variant of this game in which player 2 makes no choice has also been introduced and studied [94, 95], in order to avoid the confound of sophisticated deception, that is, players who tell the truth because they believe that player 2 will not believe them [96].
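
A minimal sketch of the payoff logic of the sender-receiver game is given below; the concrete payoffs attached to options A and B, as well as the truthful sender and trusting receiver in the example, are illustrative assumptions of ours.

```python
import random

# A minimal sketch of the sender-receiver game: both players are paid
# according to option A if the receiver guesses the true outcome, and
# according to option B otherwise; only the sender knows the two options.
def sender_receiver_round(true_outcome, message, guess_rule,
                          option_a=(5, 5), option_b=(6, 4)):
    """Return the pair of payoffs (sender, receiver) for one round."""
    guess = guess_rule(message)               # the receiver only sees the message
    return option_a if guess == true_outcome else option_b

# Example: a truthful sender facing a receiver who believes the message.
die = random.randint(1, 6)                    # the sender's private information
payoffs = sender_receiver_round(die, message=die, guess_rule=lambda m: m)
print(payoffs)                                # (5, 5): paid according to option A
```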

Other important forms of moral behavior that ought to be investigated are “equity,” that is, a desire to minimize payoff differences among players; “efficiency,” that is, a desire to maximize the total welfare; and “maximin,” that is, a desire to maximize the payoff of the worst-off player. These types of behavior are usually studied using simple distribution experiments, in which people have to decide between two or more allocations of money [18, 27, 28, 97, 98].
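
The three criteria can be made precise in a few lines of code; the candidate allocations in the sketch below are made up purely for illustration.

```python
# A minimal sketch of the equity, efficiency, and maximin criteria applied
# to candidate allocations of money among players.
def equity(allocation):
    """Negative payoff spread: higher means more equitable."""
    return -(max(allocation) - min(allocation))

def efficiency(allocation):
    """Total welfare."""
    return sum(allocation)

def maximin(allocation):
    """Payoff of the worst-off player."""
    return min(allocation)

a, b = (5, 5, 5), (9, 6, 4)
# Allocation a is preferred on equity and maximin, allocation b on efficiency.
print(equity(a) > equity(b), efficiency(b) > efficiency(a), maximin(a) > maximin(b))
# prints: True True True
```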

5. Discussion

Methods of statistical physics and network science have proven very valuable for studying the evolution of cooperation in social dilemma games. However, empirical research shows that this kind of behavior is only one form of a more general class of moral behavior. The latter includes love your family, help your group, return favors, be brave, defer to authority, be fair, respect others' property, honesty, equity, and efficiency, as well as many others. We have outlined a set of games and mathematical models that could be used effectively to study particular aspects of some of these forms of moral behavior.

Taken together, the application of statistical physics to study the evolution of moral behavior has the potential to become a flourishing and vibrant avenue for future research. We believe so for two reasons. First, it would allow us to understand why our societies evolved as they did and which moral principles are more likely to evolve. Second, since many social conflicts are ultimately conflicts between different moral positions [99–101], exploring the evolution of moral behavior could allow us to predict in advance the consequences of a moral conflict, and suggest strategies to avoid it or, in case it is unavoidable, strategies to minimize its costs. We hope that at least parts of our vision will be put into practice in the near future.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

This work was supported by the Slovenian Research Agency (Grants J1-7009 and P5-0027).

References

1. Pinker S. The Better Angels of Our Nature: The Decline of Violence in History and Its Causes. New York, NY: Penguin UK (2011).

2. Pinker S. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York, NY: Penguin (2018).

3. Ostrom E. Collective action and the evolution of social norms. J Economic Perspect. (2000) 14:137–58. doi: 10.1257/jep.14.3.137

4. Fehr E, Gächter S. Altruistic punishment in humans. Nature (2002) 415:137. doi: 10.1038/415137a

5. Milinski M, Semmann D, Krambeck H-J. Reputation helps solve the tragedy of the commons. Nature (2002) 415:424. doi: 10.1038/415424a

6. Gintis H, Bowles S, Boyd R, Fehr E. Explaining altruistic behavior in humans. Evol Hum Behav. (2003) 24:153–72. doi: 10.1016/S1090-5138(02)00157-5

7. Fehr E, Fischbacher U. Social norms and human cooperation. Trends Cogn Sci. (2004) 8:185–90. doi: 10.1016/j.tics.2004.02.007

8. Henrich J, McElreath R, Barr A, Ensminger J, Barrett C, Bolyanatz A, et al. Costly punishment across human societies. Science (2006) 312:1767–70. doi: 10.1126/science.1127333

9. Nowak MA. Five rules for the evolution of cooperation. Science (2006) 314:1560–3. doi: 10.1126/science.1133755

10. Herrmann B, Thöni C, Gächter S. Antisocial punishment across societies. Science (2008) 319:1362–7. doi: 10.1126/science.1153808

11. Bowles S, Gintis H. A Cooperative Species: Human Reciprocity and Its Evolution. Princeton, NJ: Princeton University Press (2011).

12. Capraro V. A model of human cooperation in social dilemmas. PLoS ONE (2013) 8:e72427. doi: 10.1371/journal.pone.0072427

13. Rand DG, Nowak MA. Human cooperation. Trends Cogn Sci. (2013) 17:413–25. doi: 10.1016/j.tics.2013.06.003

14. Perc M, Jordan JJ, Rand DG, Wang Z, Boccaletti S, Szolnoki A. Statistical physics of human cooperation. Phys Reports (2017) 687:1–51. doi: 10.1016/j.physrep.2017.05.004

15. Tomasello M, Carpenter M, Call J, Behne T, Moll H. In search of the uniquely human. Behav Brain Sci. (2005) 28:721–7. doi: 10.1017/S0140525X05540123

16. Hrdy SB. Mothers and Others: The Evolutionary Origins of Mutual Understanding. Cambridge, MA: Harvard University Press (2011).

17. Nowak MA, Highfield R. SuperCooperators: Altruism, Evolution, and Why We Need Each Other to Succeed. New York, NY: Free Press (2011).

18. Capraro V, Rand DG. Do the right thing: experimental evidence that preferences for moral behavior, rather than equity or efficiency per se, drive human prosociality. Judg Decis Making (2018) 13:99–111.

19. Rapoport A, Chammah AM. Prisoner's Dilemma: A Study in Conflict and Cooperation, Vol. 165. Ann Arbor, MI: University of Michigan Press (1965).

20. Skyrms B. The Stag Hunt and the Evolution of Social Structure. Cambridge: Cambridge University Press (2004).

21. Hardin G. The tragedy of the commons. J Natl Resources Policy Res. (2009) 1:243–53. doi: 10.1080/19390450903037302

22. Hamilton WD. The genetical evolution of social behaviour. I. J Theor Biol. (1964) 7:1–16.

23. Trivers RL. The evolution of reciprocal altruism. Q Rev Biol. (1971) 46:35–57.

24. Nowak MA, Sigmund K. Evolution of indirect reciprocity by image scoring. Nature (1998) 393:573. doi: 10.1038/31225

25. Fehr E, Schmidt KM. A theory of fairness, competition, and cooperation. Q J Econom. (1999) 114:817–68. doi: 10.1162/003355399556151

26. Bolton GE, Ockenfels A. ERC: a theory of equity, reciprocity, and competition. Am Econ Rev. (2000) 90:166–93. doi: 10.1257/aer.90.1.166

27. Charness G, Rabin M. Understanding social preferences with simple tests. Q J Econ. (2002) 117:817–69. doi: 10.1162/003355302760193904

28. Engelmann D, Strobel M. Inequality aversion, efficiency, and maximin preferences in simple distribution experiments. Am Econ Rev. (2004) 94:857–69. doi: 10.1257/0002828042002741

29. Rand DG, Greene JD, Nowak MA. Spontaneous giving and calculated greed. Nature (2012) 489:427. doi: 10.1038/nature11467

30. Capraro V, Halpern JY. Translucent players: explaining cooperative behavior in social dilemmas. Proc 15th conf Theoret Aspects Ration Knowl. (2016) 215:114–126. doi: 10.4204/eptcs.215.9

31. Halpern JY, Rong N. Cooperative equilibrium. In: Proceedings of the 9th International Conference on Autonomous Agents and Multiagent Systems: volume 1-Vol 1. Toronto, ON: International Foundation for Autonomous Agents and Multiagent Systems (2010). pp. 1465–1466.

32. Barcelo H, Capraro V. Group size effect on cooperation in one-shot social dilemmas. Sci Reports (2015) 5:7937. doi: 10.1038/srep07937

33. Rand DG, Nowak MA, Fowler JH, Christakis NA. Static network structure can stabilize human cooperation. Proc Natl Acad Sci USA. (2014) 111:17093–8. doi: 10.1073/pnas.1400406111

34. Nowak MA, May RM. Evolutionary games and spatial chaos. Nature (1992) 359:826. doi: 10.1038/359826a0

35. Lieberman E, Hauert C, Nowak MA. Evolutionary dynamics on graphs. Nature (2005) 433:312. doi: 10.1038/nature03204

36. Ohtsuki H, Hauert C, Lieberman E, Nowak MA. A simple rule for the evolution of cooperation on graphs and social networks. Nature (2006) 441(7092):502. doi: 10.1038/nature04605

37. Perc M, Szolnoki A. Coevolutionary games–A mini review. BioSystems (2010) 99:109–25. doi: 10.1016/j.biosystems.2009.10.003

38. Wang Z, Wang L, Szolnoki A, Perc M. Evolutionary games on multilayer networks: a colloquium. Eur Phys J B (2015) 88:124.

39. Santos FC, Pacheco JM. Scale-free networks provide a unifying framework for the emergence of cooperation. Phys Rev Lett. (2005) 95:098104. doi: 10.1103/PhysRevLett.95.098104

40. Pacheco JM, Traulsen A, Nowak MA. Coevolution of strategy and structure in complex networks with dynamical linking. Phys Rev Lett. (2006) 97:258103. doi: 10.1103/PhysRevLett.97.258103

41. Gómez-Gardeñes J, Campillo M, Floría LM, Moreno Y. Dynamical organization of cooperation in complex networks. Phys Rev Lett. (2007) 98:108103. doi: 10.1103/PhysRevLett.98.108103

42. Ohtsuki H, Nowak MA, Pacheco JM. Breaking the symmetry between interaction and replacement in evolutionary dynamics on graphs. Phys Rev Lett. (2007) 98:108106. doi: 10.1103/PhysRevLett.98.108106

43. Lee S, Holme P, Wu Z-X. Emergent hierarchical structures in multiadaptive games. Phys Rev Lett. (2011) 106:028702. doi: 10.1103/PhysRevLett.106.028702

44. Mathiesen J, Mitarai N, Sneppen K, Trusina A. Ecosystems with mutually exclusive interactions self-organize to a state of high diversity. Phys Rev Lett. (2011) 107:188101. doi: 10.1103/PhysRevLett.107.188101

45. Szolnoki A, Perc M, Szabo G. Defense mechanisms of empathetic players in the spatial ultimatum game. Phys Rev Lett. (2012) 109:078701. doi: 10.1103/PhysRevLett.109.078701

46. Assaf M, Mobilia M. Metastability and anomalous fixation in evolutionary games on scale-free networks. Phys Rev Lett. (2012) 109:188701. doi: 10.1103/PhysRevLett.109.188701

47. Gómez S, Díaz-Guilera A, Gómez-Gardeñes J, Pérez-Vicente C, Moreno Y, Arenas A. Diffusion dynamics on multiplex networks. Phys Rev Lett. (2013) 110:028701. doi: 10.1103/PhysRevLett.110.028701

48. Knebel J, Krüger T, Weber M, Frey E. Coexistence and survival in conservative Lotka-Volterra networks. Phys Rev Lett. (2013) 110:168106. doi: 10.1103/PhysRevLett.110.168106

49. Pinheiro F, Santos MD, Santos F, Pacheco J. Origin of peer influence in social networks. Phys Rev Lett. (2014) 112:098702. doi: 10.1103/PhysRevLett.112.098702

50. Szolnoki A, Perc M, Szabó G. Topology-independent impact of noise on cooperation in spatial public goods games. Physical Review E (2009) 80:056109. doi: 10.1103/PhysRevE.80.056109

51. Perc M. High-performance parallel computing in the classroom using the public goods game as an example. Eur J Phys. (2017) 38:045801. doi: 10.1088/1361-6404/aa6a0e

52. Helbing D, Szolnoki A, Perc M, Szabó G. Punish, but not too hard: How costly punishment spreads in the spatial public goods game. N J Phys. (2010) 12:083005. doi: 10.1088/1367-2630/12/8/083005

53. Szolnoki A, Szabó G, Perc M. Phase diagrams for the spatial public goods game with pool punishment. Phys Rev E (2011) 83:036101. doi: 10.1103/PhysRevE.83.036101

54. Perc M, Szolnoki A. Self-organization of punishment in structured populations. N J Phys. (2012) 14:043013. doi: 10.1088/1367-2630/14/4/043013

55. Wang Z, Xia C-Y, Meloni S, Zhou C-S, Moreno Y. Impact of social punishment on cooperative behavior in complex networks. Sci Reports (2013) 3:3055. doi: 10.1038/srep03055

56. Chen X, Szolnoki A, Perc M. Probabilistic sharing solves the problem of costly punishment. N J Phys. (2014) 16:083016. doi: 10.1088/1367-2630/16/8/083016

57. Chen X, Szolnoki A, Perc M. Competition and cooperation among different punishing strategies in the spatial public goods game. Phys Rev E (2015) 92:012819. doi: 10.1103/PhysRevE.92.012819

58. Szolnoki A, Perc M. Second-order free-riding on antisocial punishment restores the effectiveness of prosocial punishment. Phys Rev X (2017) 7:041027. doi: 10.1103/physrevx.7.041027

59. Szolnoki A, Perc M. Reward and cooperation in the spatial public goods game. EPL (Europhys Lett.) (2010) 92:38003. doi: 10.1209/0295-5075/92/38003

60. Hilbe C, Sigmund K. Incentives and opportunism: from the carrot to the stick. Proc R Soc B (2010) 277:2427–33. doi: 10.1098/rspb.2010.0065

61. Szolnoki A, Perc M. Evolutionary advantages of adaptive rewarding. N J Phys. (2012) 14:093016. doi: 10.1088/1367-2630/14/9/093016

62. Szolnoki A, Perc M. Antisocial pool rewarding does not deter public cooperation. Proc R Soc B (2015) 282:20151975. doi: 10.1098/rspb.2015.1975

63. Szolnoki A, Perc M. Competition of tolerant strategies in the spatial public goods game. N J Phys. (2016) 18:083021. doi: 10.1088/1367-2630/18/8/083021

64. Perc M. Stability of subsystem solutions in agent-based models. Eur J Phys. (2018) 39:014001. doi: 10.1088/1361-6404/aa903d

65. Dickman R. First- and second-order phase transitions in a driven lattice gas with nearest-neighbor exclusion. Phys Rev E (2001) 64:016124. doi: 10.1103/PhysRevE.64.016124

66. Szolnoki A. Dynamical mean-field approximation for a pair contact process with a particle source. Phys Rev E (2002) 66:057102. doi: 10.1103/PhysRevE.66.057102

67. Dickman R. n-site approximations and coherent-anomaly-method analysis for a stochastic sandpile. Phys Rev E (2002) 66:036122. doi: 10.1103/PhysRevE.66.036122

68. Helbing D, Szolnoki A, Perc M, Szabó G. Evolutionary establishment of moral and double moral standards through spatial interactions. PLoS Comput Biol. (2010) 6:e1000758. doi: 10.1371/journal.pcbi.1000758

69. Szolnoki A, Perc M. Correlation of positive and negative reciprocity fails to confer an evolutionary advantage: phase transitions to elementary strategies. Phys Rev X (2013) 3:041021. doi: 10.1103/physrevx.3.041021

70. Greene JD. The rise of moral cognition. Cognition (2015) 135:39–42. doi: 10.1016/j.cognition.2014.11.018

71. Tomasello M, Vaish A. Origins of human cooperation and morality. Ann Rev Psychol. (2013) 64:231–55. doi: 10.1146/annurev-psych-113011-143812

72. Rawls J. A Theory of Justice: Revised Edition. Cambridge, MA: Harvard University Press (2009).

73. Mackie J. Ethics: Inventing Right and Wrong. New York, NY: Penguin UK (1990).

74. Wong DB. Moral Relativity. Oakland, CA: University of California Press (1984).

75. Rai TS, Fiske AP. Moral psychology is relationship regulation: moral motives for unity, hierarchy, equality, and proportionality. Psychol Rev. (2011) 118:57. doi: 10.1037/a0021867

76. Curry OS. Morality as cooperation: a problem-centred approach. In: Shackelford TK, Hansen RD, editors. The Evolution of Morality. Cham: Springer (2016). p. 27–51.

77. Curry O, Mullins D, Whitehouse H. Is it good to cooperate? Testing the theory of morality-as-cooperation in 60 societies. Curr Anthropol. (in press) 1:osf.io/9546r.

78. Berg J, Dickhaut J, McCabe K. Trust, reciprocity, and social history. Games Econ Behav. (1995) 10:122–42. doi: 10.1006/game.1995.1027

79. Tajfel H. Experiments in intergroup discrimination. Sci Am. (1970) 223:96–103. doi: 10.1038/scientificamerican1170-96

80. Güth W, Schmittberger R, Schwarze B. An experimental analysis of ultimatum bargaining. J Econ Behav Organ. (1982) 3:367–88. doi: 10.1016/0167-2681(82)90011-7

81. Page KM, Nowak MA, Sigmund K. The spatial ultimatum game. Proc R Soc Lond B (2000) 267:2177–82. doi: 10.1098/rspb.2000.1266

82. Kuperman MN, Risau-Gusman S. The effect of topology on the spatial ultimatum game. Eur Phys J B (2008) 62:233–8. doi: 10.1140/epjb/e2008-00133-x

83. Eguíluz VM, Tessone C. Critical behavior in an evolutionary ultimatum game with social structure. Adv Complex Syst. (2009) 12:221–32. doi: 10.1142/S0219525909002179

84. da Silva R, Kellerman GA, Lamb LC. Statistical fluctuations in population bargaining in the ultimatum game: Static and evolutionary aspects. J Theor Biol (2009) 258:208–18. doi: 10.1016/j.jtbi.2009.01.017

85. Deng L, Tang W, Zhang J. The coevolutionay ultimatum game on different network topologies. Phys A (2011) 390:4227–35. doi: 10.1016/j.physa.2011.06.076

86. Gao J, Li Z, Wu T, Wang L. The coevolutionary ultimatum game. EPL (2011) 93:48003. doi: 10.1209/0295-5075/93/48003

87. Szolnoki A, Perc M, Szabó G. Accuracy in strategy imitations promotes the evolution of fairness in the spatial ultimatum game. EPL (Europhys Lett.) (2012) 100:28005. doi: 10.1209/0295-5075/100/28005

88. Deng L, Wang C, Tang W, Zhou G, Cai J. A network growth model based on the evolutionary ultimatum game. J Stat Mech. (2012) 2012:P11013. doi: 10.1088/1742-5468/2012/11/P11013

89. Iranzo J, Floría L, Moreno Y, Sánchez A. Empathy emerges spontaneously in the ultimatum game: Small groups and networks. PLoS ONE (2011) 7:e43781. doi: 10.1371/journal.pone.0043781

90. Miyaji K, Wang Z, Tanimoto J, Hagishima A, Kokubo S. The evolution of fairness in the coevolutionary ultimatum games. Chaos Solit Fractals (2013) 56:13–18. doi: 10.1016/j.chaos.2013.05.007

91. Forsythe R, Horowitz JL, Savin NE, Sefton M. Fairness in simple bargaining experiments. Games Econ Behav. (1994) 6:347–69. doi: 10.1006/game.1994.1021

92. Krupka EL, Weber RA. Identifying social norms using coordination games: why does dictator game sharing vary? J Eur Econ Assoc. (2013) 11:495–524. doi: 10.1111/jeea.12006

93. Gneezy U. Deception: the role of consequences. Am Econ Rev. (2005) 95:384–94. doi: 10.1257/0002828053828662

94. Biziou-van Pol L, Haenen J, Novaro A, Liberman AO, Capraro V. Does telling white lies signal pro-social preferences? Judg Decis Making (2015) 10:538–48.

95. Capraro V. Does the truth come naturally? time pressure increases honesty in one-shot deception games. Econ Lett. (2017) 158:54–7. doi: 10.1016/j.econlet.2017.06.015

96. Sutter M. Deception through telling the truth?! experimental evidence from individuals and teams. Econ J. (2009) 119:47–60. doi: 10.1111/j.1468-0297.2008.02205.x

97. Capraro V, Smyth C, Mylona K, Niblo GA. Benevolent characteristics promote cooperative behaviour among humans. PLoS ONE (2014) 9:e102881. doi: 10.1371/journal.pone.0102881

98. Tappin BM, Capraro V. Doing good vs. avoiding bad in prosocial choice: a refined test and extension of the morality preference hypothesis. J Exp Soc Psychol (2018) 79:64–70. doi: 10.1016/j.jesp.2018.06.005

99. Nagel T. Moral conflict and political legitimacy. Philos Public Affairs (1987) 16:215–40.

100. Pearce WB, Littlejohn SW. Moral Conflict: When Social Worlds Collide. Thousand Oaks, CA: Sage (1997).

101. Bartos OJ, Wehr P. Using Conflict Theory. Cambridge, UK: Cambridge University Press (2002).

Keywords: cooperation, social dilemma, public good, statistical physics, network science, evolutionary game theory, moral behavior, reciprocity

Citation: Capraro V and Perc M (2018) Grand Challenges in Social Physics: In Pursuit of Moral Behavior. Front. Phys. 6:107. doi: 10.3389/fphy.2018.00107

Received: 24 August 2018; Accepted: 06 September 2018;
Published: 11 October 2018.

Edited and reviewed by: Alex Hansen, Norwegian University of Science and Technology, Norway

Copyright © 2018 Capraro and Perc. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Valerio Capraro, v.capraro@mdx.ac.uk
Matjaž Perc, matjaz.perc@gmail.com
