This is a piece I wrote a couple of weeks ago on Whistle Stopper. I’ll probably recycle some of my other “greatest hits” there as entries here over the next few days.
In honor of the late John von Neumann’s 100th birthday, here’s a little review of some things we should have learned from game theory – a field he practically founded.
The best-known lesson from game theory has to do with the Prisoner’s Dilemma. It’s a very simple game, devised at RAND by Merrill Flood and Melvin Dresher and given its name and famous backstory by Albert Tucker, which goes like this:
Tucker began with a little story, like this: two burglars, Bob and Al, are captured near the scene of a burglary and are given the “third degree” separately by the police. Each has to choose whether or not to confess and implicate the other. If neither man confesses, then both will serve one year on a charge of carrying a concealed weapon. If each confesses and implicates the other, both will go to prison for 10 years. However, if one burglar confesses and implicates the other, and the other burglar does not confess, the one who has collaborated with the police will go free, while the other burglar will go to prison for 20 years on the maximum charge.
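Tucker’s story can be written down as a tiny payoff table. Here’s a minimal sketch in Python (my own illustration, with the numbers taken straight from the story above, measured in years in prison, so lower is better). It checks the two facts that make the dilemma a dilemma: confessing is the better move no matter what your partner does, yet mutual confession leaves both men worse off than mutual silence.

```python
# Outcomes keyed by (Bob's move, Al's move); values are (Bob's years, Al's years).
SILENT, CONFESS = "silent", "confess"
years = {
    (SILENT,  SILENT):  (1, 1),    # both stay quiet: concealed-weapon charge only
    (CONFESS, CONFESS): (10, 10),  # both confess and implicate the other
    (CONFESS, SILENT):  (0, 20),   # Bob rats, Al stays quiet
    (SILENT,  CONFESS): (20, 0),   # Al rats, Bob stays quiet
}

# Whatever Al does, Bob serves less time by confessing...
for al_move in (SILENT, CONFESS):
    bob_if_silent  = years[(SILENT,  al_move)][0]
    bob_if_confess = years[(CONFESS, al_move)][0]
    assert bob_if_confess < bob_if_silent

# ...yet mutual confession (10 years each) is worse for both
# than mutual silence (1 year each).
assert years[(CONFESS, CONFESS)][0] > years[(SILENT, SILENT)][0]
```

The same reasoning applies symmetrically to Al, which is why two rational prisoners end up at the bad outcome.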
The interesting thing about the scenario is that there’s a temptation to “defect” (rat on your partner), but it only works if the other guy doesn’t do it. If both defect, the result for both players is worse than if they had both “cooperated” (remained silent). In other words, selfish opportunism screws things up for both players. This apparent paradox was studied extensively by many people, including John Nash (of A Beautiful Mind fame, whose work on equilibria in games like this one earned him a Nobel) and Robert Axelrod. Axelrod’s specialty was the iterated prisoners’ dilemma, in which players face each other in round after round of the basic game, remembering the history of past moves and trying to maximize their results according to different strategies. As described in his book The Evolution of Cooperation, the selfish opportunistic strategies are not optimal. The best strategies over the long term are those that encourage mutual cooperation, not pure competition.
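To make the iterated game concrete, here’s a minimal sketch (my own illustration, not Axelrod’s actual tournament code) using his conventional point payoffs: 3 each for mutual cooperation, 1 each for mutual defection, and 5/0 when one player defects on a cooperator. It shows why reciprocators prosper over repeated play: two tit-for-tat players rack up the full cooperative score, while two defectors grind along at the bottom, and a defector who meets a reciprocator “wins” the match by a hair while earning far less than cooperation would have paid.

```python
C, D = "C", "D"  # cooperate (stay silent) or defect (rat)
# (my payoff, their payoff) for each pair of moves, in Axelrod's points.
PAYOFF = {(C, C): (3, 3), (D, D): (1, 1), (D, C): (5, 0), (C, D): (0, 5)}

def always_defect(my_moves, their_moves):
    return D

def tit_for_tat(my_moves, their_moves):
    # Cooperate on the first round, then copy the opponent's last move.
    return their_moves[-1] if their_moves else C

def play(strat_a, strat_b, rounds=200):
    """Run one match and return the two players' total scores."""
    moves_a, moves_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strat_a(moves_a, moves_b)
        b = strat_b(moves_b, moves_a)
        pay_a, pay_b = PAYOFF[(a, b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # (600, 600): full mutual cooperation
print(play(always_defect, always_defect)) # (200, 200): mutual punishment
print(play(always_defect, tit_for_tat))   # (204, 199): the defector edges out
                                          # its victim, but both earn far less
                                          # than a cooperative pairing
```

The 200-round match length is an arbitrary choice for illustration; the comparison comes out the same way for any reasonably long match.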
Too abstract, you say? Not applicable to real people in real situations? Consider gridlock. It’s a very common and obvious example of people each trying to take advantage of others’ cooperative nature, and screwing themselves and everyone else in the process. Four people who approach an intersection in a cooperative frame of mind will all get through it faster than four people who each try to rush the light and end up creating a traffic jam.
Still not good enough? Consider these examples, then:
- Imagine two companies, competing in multiple markets. One (Microsoft) uses bundling or predatory pricing to drive another (Netscape) out of a market, or perhaps out of business altogether. Once that is achieved, the monopoly holder is able to prosper despite inferior products or service. The more innovative competitor is gone, and the market suffers.
- Imagine several airlines, each charging about the same price for tickets. This works great for them, so long as nobody “defects” by undercutting the others and trying to gain market share. As soon as someone does that, a price war ensues and they’re all worse off than if they had all settled for the profits from the cooperative arrangement.
Yes, these things really happen. Anybody who doesn’t see that is blind. A hasty interlocutor might at this point say that the second example shows how the selfish urge can defeat monopolistic schemes, but there’s a problem with that. The people who run companies know this stuff. Some of them knew this stuff before there was such a thing as game theory, and the others stayed awake in class. They know how to use game theory to their advantage, to profit from the first scenario and stay out of the second. They rely on people who didn’t stay awake in class to spread the dogma that lets them continue making money by means other than providing superior products and services. It suits their purposes very well indeed to have foot soldiers who reassure everyone that the “invisible hand” will take care of everything and there’s no reason for government to get involved. Another one of Axelrod’s results was that the only time opportunistic strategies do work is when there’s a high percentage of naive always-cooperate players for them to fleece.
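That last caveat of Axelrod’s can be checked with back-of-the-envelope arithmetic. The sketch below (my own, using the conventional 3/1/5/0 point payoffs and an arbitrary 200-round match) compares a lone always-defector’s average score per match against a population of k naive always-cooperators and n − k tit-for-tat reciprocators. The exact crossover point is an artifact of my assumed numbers, but the shape of the result is his: defection only pays when the naive make up a large share of the field.

```python
ROUNDS = 200  # arbitrary match length, for illustration

def all_d_avg(n, k):
    """Average score per match for a lone always-defector facing a
    population of k always-cooperators and n - k tit-for-tats."""
    vs_naive = 5 * ROUNDS          # fleece a naive cooperator every round
    vs_tft = 5 + 1 * (ROUNDS - 1)  # one temptation payoff, then mutual defection
    return (k * vs_naive + (n - k) * vs_tft) / n

def tft_avg(n, k):
    """Average score per match for a tit-for-tat incumbent, which ends up
    cooperating with everyone except the lone defector."""
    vs_cooperative = 3 * ROUNDS         # mutual cooperation all match
    vs_defector = 0 + 1 * (ROUNDS - 1)  # suckered once, then mutual defection
    return ((n - 1) * vs_cooperative + vs_defector) / n

for k in (0, 5, 10, 15):
    print(k, all_d_avg(20, k), tft_avg(20, k))
```

With n = 20, the defector averages only 204 points per match when everyone else reciprocates (against the incumbents’ roughly 580), and doesn’t pull ahead until about half the field consists of naive cooperators.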
So there you have it. From very basic game theory to real-life economics, the same principles hold true. Some people would have us believe that something will be magically different if we just turn this knob or push that button. That’s bunk. No rule is universal, but when a rule holds true across a broad variety of situations, it’s less logical to believe it will break down in a new situation with no significant differences than to believe it will continue to hold true. Whether the proponents of such an illogical and counterfactual theory are simply naive, or hope to benefit by convincing others of something they themselves don’t believe, is unclear, but nobody who has studied game theory would ever be taken in.
Rest in peace, John von Neumann. Your work did make a difference.