Op-ed: Game Theory is preferable to Machiavellianism in International Affairs
August 11, 2017 - Fort Russ News -
Op-ed by Denis Churilov
There’s a classic psychological experiment known as the Prisoner’s Dilemma. In its most basic form, it goes as follows. Two participants assume the roles of prisoners. Each is given a choice: either testify against his "partner in crime" or stay silent.
1) If both participants stay silent, they both get 3 years in prison.
2) If they both choose to testify against one another, they both get 5 years in prison.
3) However, if one chooses to stay silent but the other one betrays him by testifying, the prisoner who testified is set free while his betrayed "partner in crime" receives a 10-year prison sentence.
The participants are not allowed to communicate with each other during the experiment, so they can’t coordinate and have to guess the other player’s intentions. The game is played for a limited number of rounds, usually 10.
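For readers who prefer code to prose, the three payoff rules above can be written out directly. Here is a minimal sketch in Python; the names (PAYOFFS, sentence) are purely illustrative:

```python
SILENT, TESTIFY = "silent", "testify"

# (my move, partner's move) -> (my sentence, partner's sentence), in years
PAYOFFS = {
    (SILENT,  SILENT):  (3, 3),    # rule 1: both stay silent
    (TESTIFY, TESTIFY): (5, 5),    # rule 2: both testify
    (TESTIFY, SILENT):  (0, 10),   # rule 3: I testify, partner stays silent
    (SILENT,  TESTIFY): (10, 0),   # rule 3, mirrored
}

def sentence(my_move, partner_move):
    """My prison sentence (in years) for one round of the game."""
    return PAYOFFS[(my_move, partner_move)][0]
```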
By the logic of mutual trust, it is in both participants’ interests to cooperate and stay silent, securing the shortest possible jail term for both.
However, as repeated studies have shown for decades, people almost always end up betraying one another by testifying against their "partner in crime" in the early stages of the game; from that point on, all the remaining rounds play out as "testify-testify", with each participant collecting 5 years in prison.
So, by the logic of game theory, the optimal strategy in the Prisoner’s Dilemma is to betray your friend by testifying against him, because, at one point or another, he is going to betray you anyway; altruism becomes maladaptive, and mutual trust and cooperation are completely destroyed.
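Why does a known, fixed number of rounds poison cooperation in this way? The standard explanation is backward induction: in the final round there is no future to protect, so testifying strictly dominates, and knowing that, the same logic applies to the round before it, all the way back to round one. A small illustrative sketch of that unravelling argument (the function is mine, not part of any standard library):

```python
def rational_move(round_no, total_rounds):
    """Backward induction in a game with a known final round."""
    if round_no == total_rounds:
        # Last round: no retaliation is possible afterwards,
        # so testifying strictly dominates staying silent.
        return "testify"
    # Every later round is already doomed to "testify", so
    # cooperating now buys no future goodwill either.
    assert rational_move(round_no + 1, total_rounds) == "testify"
    return "testify"

print([rational_move(r, 10) for r in range(1, 11)])
# ['testify', 'testify', ... x10] - cooperation unravels completely
```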
That is in a controlled laboratory environment. In the real world, however, there are a few factors that change the game completely.
First of all, unlike laboratory conditions, real-world psychological/trust games are not limited to a fixed number of rounds, so people have to adapt and run long-term strategies indefinitely. Then there is the huge factor of the open world: unlike the laboratory environment with only two players who cannot communicate, people talk to many other people, passing along information about who betrayed whom, so every real-world player has a reputation that he is interested in maintaining at a certain level.
Therefore, as mathematicians have established, the optimal strategy in real-world society (and in open biological systems in general) is to cooperate trustfully with others on first contact and then simply mirror their responses to your actions (i.e. if you do them a favour and they return it, you keep cooperating by exchanging favours; if they betray you at any point, you stop doing them favours and suspend cooperation).
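This mirroring strategy is essentially Tit-for-Tat, which famously won Robert Axelrod’s iterated Prisoner’s Dilemma tournaments. Below is a minimal sketch in Python, reusing the prison sentences from the dilemma above; all function names are illustrative:

```python
# Payoffs reuse the sentences from the dilemma above (years in prison,
# lower is better): 3/3 for mutual silence ("cooperate"), 5/5 for
# mutual testimony ("defect"), 0/10 when only one testifies.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("defect",    "defect"):    (5, 5),
    ("defect",    "cooperate"): (0, 10),
    ("cooperate", "defect"):    (10, 0),
}

def tit_for_tat(own_moves, partner_moves):
    """Cooperate on first contact, then mirror the partner's last move."""
    return "cooperate" if not partner_moves else partner_moves[-1]

def always_defect(own_moves, partner_moves):
    """The pure 'betrayer': testifies every round."""
    return "defect"

def iterated_game(strategy_a, strategy_b, rounds=100):
    """Play an iterated dilemma; return total years served by each side."""
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        years_a, years_b = PAYOFFS[(move_a, move_b)]
        total_a += years_a
        total_b += years_b
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

print(iterated_game(tit_for_tat, tit_for_tat))    # (300, 300): steady cooperation
print(iterated_game(tit_for_tat, always_defect))  # (505, 495): betrayed once, then mirrors
```

Note that Tit-for-Tat is never exploited more than once by a betrayer, yet reaps the full benefit of cooperation whenever it meets another cooperator.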
Interestingly, societies (and biological systems) with higher rates of mutual trust and cooperation between members are more sustainable and usually hold an advantage over other societies/systems.
While it’s impossible to have a social/zoological system in which everyone is trusting and altruistic (due to the mere laws of statistics and the normal distribution), it is still possible to keep the cooperator/betrayer ratio at an optimal and sustainable level.
When the cooperator/betrayer ratio in the system shifts towards statistical domination by "betrayers", the system collapses or undergoes a painful reconfiguration until a stable ratio is re-established (assuming the system manages to survive the crisis at all).
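The collective cost of a betrayer-dominated system can be made concrete with a toy mixing model (my own illustration, not the author’s, reusing the sentences from the dilemma above): as the cooperator share falls, the average sentence served across the whole population climbs.

```python
def average_outcomes(coop_share):
    """Expected years served, given a fraction coop_share of cooperators
    paired at random in one-shot encounters."""
    p = coop_share
    cooperator = 3 * p + 10 * (1 - p)   # 3 vs a cooperator, 10 vs a betrayer
    betrayer   = 0 * p + 5 * (1 - p)    # 0 vs a cooperator, 5 vs a betrayer
    population = p * cooperator + (1 - p) * betrayer
    return cooperator, betrayer, population

for share in (1.0, 0.75, 0.5, 0.25, 0.0):
    _, _, avg = average_outcomes(share)
    print(f"cooperators {share:.0%}: population average {avg:.2f} years")
# The average climbs from 3.00 years (all cooperators) towards
# 5.00 years (all betrayers): everyone is worse off collectively.
```

Each individual betrayer still does better than his victims, which is exactly why the ratio drifts; the model only shows what the drift costs the system as a whole.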
But the general rule remains the same: systems whose members trust each other have a better chance of surviving and persisting through time. That is how early prehistoric tribes were able to band together and survive through cooperation, dealing with their enemies and environmental challenges, eventually learning to trust and cooperate with other tribes and to form the larger societies that gave birth to human civilisation.
Interestingly, many stable religions have these game-theory strategies imprinted in them. Christianity, for instance, advocates at the basic level that people treat each other altruistically (highly moral people tend to trust others first, and then act according to the responses their altruism meets with).
Mutual trust AND cooperation almost always lead to better long-term outcomes. For everyone. This is shown both by theory and by observable reality, in human societies and animal groups alike. Unfortunately, huge volumes of philosophical theory still advocate the principles of Machiavellianism and other rudiments of early Modernity in politics and international relations.