News Release

Researchers develop the first model to capture crosstalk in social dilemmas

Peer-Reviewed Publication

Institute of Science and Technology Austria

Crosstalk in Social Dilemmas

image: Previous games influence decisions in other independent games through crosstalk.

Credit: (c) by IST Austria, 2018

The idea that previous interactions can affect unrelated future decisions might seem obvious: the stranger in front of you pays for your coffee, and then you pay for the stranger behind you. You've had no interaction with the latter, and no reason to do them a favor, but you do it anyway. Similarly, if a friend refuses to help, you might be less inclined to help the next person who asks you for something. These are both instances of crosstalk--previous interactions affecting unrelated future decisions--and though this notion might seem natural, it had never before been incorporated into simulations of groups engaging in repeated social dilemmas. A new framework developed by computer scientists at IST Austria and their collaborators at Harvard, Yale, and Stanford has changed that, and enables the analysis of the effects of crosstalk between games.

The prisoner's dilemma is a classic example of a social dilemma--that is, a situation where both people would be better off if they cooperated than if they both defected, but there is still some incentive to defect. When social dilemmas are repeated, people develop (usually subconsciously) a strategy that dictates when they should cooperate and when they should defect. Researchers use computer simulations to study repeated social dilemmas, or "games", by assigning virtual players different strategies, and have established which strategies lead to the development of cooperation and how stable the resulting cooperative situations are. Successful strategies include, for instance, "tit-for-tat" (I start by cooperating, and then I do whatever you did last) and "win-stay, lose-shift" (I start by cooperating, then keep doing what I'm doing as long as it pays off, and switch as soon as it doesn't).
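As a rough illustration of how such rules can be written down--a minimal sketch, not the authors' code--the two strategies might look like this in Python, with "C" standing for cooperate and "D" for defect:

def tit_for_tat(my_history, opponent_history):
    """Cooperate in the first round, then copy the opponent's last move."""
    if not opponent_history:
        return "C"
    return opponent_history[-1]

def win_stay_lose_shift(my_history, opponent_history):
    """Cooperate first; repeat the previous move if the last round paid off
    (the opponent cooperated), otherwise switch."""
    if not my_history:
        return "C"
    last_me, last_opponent = my_history[-1], opponent_history[-1]
    if last_opponent == "C":                 # the round paid off: stay
        return last_me
    return "D" if last_me == "C" else "C"    # it did not: shift

# Example: two tit-for-tat players settle into mutual cooperation.
history_a, history_b = [], []
for _ in range(5):
    move_a = tit_for_tat(history_a, history_b)
    move_b = tit_for_tat(history_b, history_a)
    history_a.append(move_a)
    history_b.append(move_b)
print(history_a)   # ['C', 'C', 'C', 'C', 'C']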

However, in all of these previous studies, scientists have assumed either that a player only ever interacts with one other player (i.e. Bob only ever plays Alice), or that a player's decisions in one game are completely independent of their decisions in another (i.e. Bob's games with Alice have no effect on his games with Caroline). These assumptions do not necessarily hold for real-life social dilemmas: humans are often involved in many simultaneous games, and interactions with one player spill over into games with others. In other words, these games are subject to crosstalk.

Now, a team of researchers has developed a new framework that addresses this limitation in the theory and allows the effects of crosstalk on cooperation dynamics in a population to be evaluated quantitatively. The team includes co-first authors Johannes Reiter, an IST Austria alumnus and current instructor at Stanford, and Christian Hilbe, a postdoc at IST Austria, as well as Professors David Rand, Krishnendu Chatterjee, and Martin Nowak, of Yale, IST Austria, and Harvard, respectively. Their combined expertise and perspectives, spanning evolutionary dynamics, game theory, psychology, and economics, all played a role in creating the new model.

In a given simulation, each virtual player keeps a memory of the games played with each of the other players. In previous models, a player would review their past with their current opponent and decide on a course of action based on that history and their strategy. In the new model, there is some chance that this memory is replaced by the memory corresponding to a third player. This way of encoding crosstalk is general: it accounts for the many varieties of crosstalk, be it simple human error (mixing people up), paying it forward (you remember your good experience), or some other type. Moreover, it can be applied to any social network--from a group where everyone knows everyone else, to a ring of neighbors, to a random web of connections.
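To make the mechanism concrete, here is a minimal sketch of how such a crosstalk step could be encoded. The crosstalk probability, the neighborhood structure, and the function names are purely illustrative assumptions; this is not the authors' implementation.

import random

def choose_move(player, opponent, memories, strategy, crosstalk_prob, neighbors):
    """Pick the next move against `opponent`; with probability
    `crosstalk_prob`, consult the history with a third player instead."""
    source = opponent
    others = [n for n in neighbors[player] if n != opponent]
    if others and random.random() < crosstalk_prob:
        source = random.choice(others)     # crosstalk: memories get mixed up
    my_history, their_history = memories[(player, source)]
    return strategy(my_history, their_history)

# Example on a fully connected group of three players, reusing the
# tit_for_tat rule sketched earlier.
players = ["Alice", "Bob", "Caroline"]
neighbors = {p: [q for q in players if q != p] for p in players}
memories = {(p, q): ([], []) for p in players for q in players if p != q}
move = choose_move("Bob", "Alice", memories, tit_for_tat,
                   crosstalk_prob=0.1, neighbors=neighbors)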

For Christian Hilbe, this development was exactly what the framework needed: "When modeling repeated games, you always have certain phenomena that you want to describe. For me, it never felt as though previous models were complete. When we introduced crosstalk, it was as if everything snapped together--this is the model we should be using."

Human error has previously been considered in simulations of repeated social dilemmas. The difference here is that while such errors affect only the repeated game in which they occur, crosstalk causes ripple effects across the entire population: "When crosstalk is introduced, suddenly you're not playing against a single person--you're playing against everyone you are connected to, the whole society," explains Krishnendu Chatterjee.

As a result, both cooperative and defective behavior spread much more easily--even a single defector can cause the complete breakdown of cooperation in a society if the other players are not sufficiently forgiving. Crosstalk also demands strategies with the "correct" level of forgiveness: too harsh, and you end up with a society where no one cooperates; too generous, and defection spreads as players learn to take advantage of one another. Crosstalk moreover hinders the evolution of cooperation: the authors implemented an evolutionary model and found that crosstalk decreases the number of different starting societies that end up in stable cooperative states.

Their paper, published today in Nature Communications, presents an interesting message for our current society. Johannes Reiter explains: "The presence of crosstalk means that players must be more forgiving, especially in a network that is highly connected. A harsh strategy for cooperation, such as tit-for-tat, is particularly disastrous in this environment."

###

