Saturday, December 23, 2006

The Evolution of Cooperation

If I asked you to name the book that has had the most impact on the way you look at the world, you’d probably have as much trouble as I would. But if I asked you to name one in your top 5, without the singling out, maybe it’d be easier. It would for me. And this is one I’d say it about: Robert Axelrod’s The Evolution of Cooperation. And now it’s back out in a revised edition, so you can likely find it at your local bookstore if you don’t want to go online.

Axelrod had already established himself as an international relations specialist, especially with his applications of cognitive schemas and mapping. But he’s made his most lasting mark with E of C, one of the first and still most important computer simulations, back in the day when desktop PCs were still pretty much sci-fi and being written off as unworkable by IBM.

What Axelrod did was perversely simple by today’s standards. He sent out a request to a range of scholars across many disciplines to have them submit what they thought would be the strategy that would pile up the most points in a long-term game of “Prisoner’s Dilemma.” Then he ran them against each other to determine what the most successful strategy was.

Don’t know “Prisoner’s Dilemma”? I could just tell you it’s a game played with weird enthusiasm by a bunch of academic nerds seeking esoterica for publication. But, less snidely, the game basically posits that you and your buddy have been caught dead to rights on a lesser offense, but the cops believe (but can’t prove) you’ve done worse together. To prove the latter, they have to get confessions. Now, being smart interrogators, they separate you two and offer each of you a deal. You finger your partner for the crime: he does the full time for the worse offense, you walk. Of course, he’s being offered the same deal, so, if you both rat each other out, you’re both doing harder time. On the other hand, if you both keep quiet, you’ll each get a lesser sentence than if you both snitch. What do you do?

Axelrod assigned values to the four outcomes (you “defect” and your partner “cooperates,” vice versa, you both snitch, or you both keep quiet) and played the strategies out. It doesn’t take a genius to figure out that, short-term, defection will be more popular, having the biggest payout and all if your partner is stupid enough to trust you. But over the long term, especially when you don’t know when the game will end, is it still smarter to be a jerk all the time?
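To make those values concrete, here’s a minimal Python sketch, using the payoff numbers conventionally associated with Axelrod’s tournament (temptation 5, mutual cooperation 3, mutual defection 1, sucker’s payoff 0); read higher as better, i.e., less time served:

```python
# Scoring one round of Prisoner's Dilemma with the conventional tournament
# values; higher is better (think: less sentence served).
PAYOFF = {
    ("defect", "cooperate"): (5, 0),    # you snitch, your partner keeps quiet
    ("cooperate", "defect"): (0, 5),    # you keep quiet, your partner snitches
    ("defect", "defect"): (1, 1),       # you both snitch
    ("cooperate", "cooperate"): (3, 3), # you both keep quiet
}

def score(my_move, their_move):
    """Return (my points, their points) for a single round."""
    return PAYOFF[(my_move, their_move)]
```

Run the one-round arithmetic and defecting pays more no matter what your partner does: 5 beats 3 if he stays quiet, 1 beats 0 if he talks. Over the long haul, though?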

No. Turns out that defectors bring on defectors, and the harsher penalties that come with them. But unconditional cooperators don’t do well either: known for their predictable and certain cooperation, they can be dumped on at will by defectors, who still pile up the big gains. Which leads us to the simple strategy for winning long-term: reciprocity, or TIT FOR TAT. Which is exactly what it sounds like: open by cooperating, then whatever was done to you last turn, do it back. Defect, defect, defect until the idiot figures it out. Once you’re both on “cooperate,” you’ll get consistent points even if they aren’t the highest possible per turn. And groups of interacting cooperators, having demonstrated their willingness to defect when defected upon, pile up the points.
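If you want to watch that happen, here’s a toy sketch of mine (not Axelrod’s tournament code) of an iterated match, pitting TIT FOR TAT against an always-defector and against another reciprocator, using the same payoff table sketched above:

```python
# Toy iterated Prisoner's Dilemma match -- an illustration, not a
# reconstruction of Axelrod's tournament.
PAYOFF = {("defect", "cooperate"): (5, 0), ("cooperate", "defect"): (0, 5),
          ("defect", "defect"): (1, 1), ("cooperate", "cooperate"): (3, 3)}

def tit_for_tat(my_history, their_history):
    """Cooperate first, then echo whatever the opponent did last round."""
    return "cooperate" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "defect"

def play_match(strategy_a, strategy_b, rounds=200):
    """Play two strategies against each other and return their total scores."""
    history_a, history_b = [], []
    total_a = total_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        points_a, points_b = PAYOFF[(move_a, move_b)]
        total_a += points_a
        total_b += points_b
        history_a.append(move_a)
        history_b.append(move_b)
    return total_a, total_b

print(play_match(tit_for_tat, always_defect))  # (199, 204): both grind along at ~1 a round
print(play_match(tit_for_tat, tit_for_tat))    # (600, 600): steady 3 a round
```

The jerk “wins” the first match by a whisker and both players limp away nearly broke; two reciprocators triple that take, which is more or less the tournament result in miniature.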

Now, Axelrod later found that a little mercy sometimes worked a little better (if the partner defected by mistake or only did it once in a very great while, like your sad, sad love life), but overall TIT FOR TAT was an amazingly successful strategy. Change the rewards for defecting or cooperating, of course, and the game can change dramatically. But the message was essentially this: Do unto others as you would . . . well, you get it. And it was scientifically proven.
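That “little mercy” idea usually goes by the name generous tit for tat. Here’s a sketch of one way to write it, with the forgiveness rate as my own illustrative guess, not a number from the book:

```python
import random

def generous_tit_for_tat(my_history, their_history, forgiveness=0.1):
    """TIT FOR TAT that occasionally lets a defection slide.

    The 10% forgiveness rate is an illustrative assumption, not Axelrod's
    figure. The point is only that a little mercy keeps a single mistaken
    defection from echoing back and forth as a permanent defect-defect feud.
    """
    if not their_history or their_history[-1] == "cooperate":
        return "cooperate"
    return "cooperate" if random.random() < forgiveness else "defect"
```

Pair it with a mostly cooperative partner whose hand slips now and then and it tends to outscore the strict version, because it can break the chain of retaliation instead of trading 1-point rounds forever.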

The implications for criminal justice should be obvious. Transgressions must be met in proportion to the harm done, and defectors must be rewarded for choosing to cooperate. The costs of defecting must be made higher than those of cooperating, or certainly of doing nothing in response at all. That speaks to the punishment aspect. But first, we have to guarantee a definite response. Certainty comes first, then severity (as crim theory also has it), meaning policy that short-changes proactive prevention, or apprehension and accountability, in the name of merely possible later punishment will be short-sighted and less effective. Then, once the proportionate response has been made, we must reward the prodigal defectors for their return to cooperation. If we don’t, if we keep them from jobs, from voting, from living among us, then there are no points in cooperating, are there? And we know how to spell “recidivism.” Yes, you do prison, but only as demonstrably necessary to counteract the defection and as part of a holistic criminal justice policy strategy.

Axelrod’s book, and its sequel, The Complexity of Cooperation, have far more depth and nuance than I can relate here, including answers to the objections you undoubtedly came up with as you read my summary. And his major concern was international relations, not interpersonal ones. But the lessons apply to any interaction in which gains and losses are distributed based on practicing reciprocity or on asserting superiority: parents/children, bullies/bullied, sweetie/sweetie, crook/cop, boss/employee, prosecution/defense, political party/political party. A lot of great analysis has connected Axelrod’s dots in the last 26 years. There’s a lot more to be learned and applied. It’s good to see the book still has an audience. I hope you’ll be in it.
