Playing The Meta, Part 2: How To Kill Sheep

"Turkish shepherds watched in horror as hundreds of their sheep followed each other over a cliff, say Turkish newspaper reports. First one sheep went over the cliff edge, only to be followed by the whole flock, according to the reports. More than 400 sheep died in the 15-metre fall - their bodies cushioning the fall of 1,100 others who followed." - BBC
"An ant mill is an observed phenomenon in which a group of army ants, separated from the main foraging party, lose the pheromone track and begin to follow one another, forming a continuously rotating circle. This circle is commonly known as a ‘death spiral’ because the ants might eventually die of exhaustion..." - Wikipedia

Welcome to Part 2 of our series on “Playing The Meta” - playing, not the game, but the game outside the game.

Last week, we discussed common knowledge - not merely what everyone knows, but what everyone knows that everyone knows.

Common knowledge is important because people - like sheep, and like ants - are other-regarding. What other people do and think is of critical importance to us.

I do not mean this in the sense that we want other people to like us, or to "fit in." I mean that, at a structural level, the actions of others inform how we make decisions.

Critically, we do not think this is the case. We all believe and feel that our decisions are our own. Our reliance on common knowledge - on groupthink - is an entirely unconscious process, part of the reasoning machinery with which all of our thoughts are created. It is not something we can opt out of. It is not something we can rise above.

To be human is to be other-regarding.

We see this at play in the bystander effect. Take a few moments to watch this video:

Various pundits have decried videos like these as evidence of a fraying social fabric, but the issue is more deeply rooted than that. These videos show that the behavior of others is a major determinant of our own behavior, even when it directly conflicts with our stated views or preferences ("I am a good person who would help someone in dire need").

Indeed, the Bystander Effect can be seen as an extension of a Common Knowledge problem. You know that I know that you know that we should all help someone in need. In this case, no one is stopping to help, and therefore the person can't really be in need...can they?

The trick about Common Knowledge games is that once there is a shift in Common Knowledge, behavior changes rapidly. This was the case in the video above. Once someone was seen (not just by one person, but publicly) going over to help, group norms - and thus behavior - changed dramatically.

Remember this fact: it does not matter what everyone knows individually (that there is someone on the ground and in distress). What matters is what everyone knows that everyone knows. In this case, because no one is stopping, it never becomes clear to everyone that everyone knows someone is in trouble.

In fact, if you want to influence human behavior, the way to do it is not with logical argumentation or even emotional exhortation. The fastest way to change human behavior is to change common knowledge.

Ben Hunt wrote about this:

"Social behavior of individuals does not change on the basis of private knowledge, no matter how pervasive it might be. Even if everyone in the world believes a certain piece of private information, no one will alter their behavior. Behavior changes ONLY when we believe that everyone else believes the information. THAT’S what changes behavior. And when that transition to common knowledge happens, behavior changes fast."

It is easy to judge these tendencies. Calls to independent thought are so common that “Wake up, Sheeple!” has become a meme. But other-regarding behaviors are adaptive; they helped our ancestors survive. We are all the sons and daughters of those with highly developed other-regarding abilities; there is no "out" of this tendency. Indeed, we would not want to live in a world without other-regarding, without empathy, without group feeling or the ability to socially cohere.

At the same time, the Bystander Effect is just one example of the ways in which other-regarding can lead to irrational behaviors in individuals and groups - even to the death.

Take informational cascades.

Ant mills, mentioned above, are one example. Informational cascades are feedback loops in which the decisions of others convince us to disregard our own experience.

For a wonderful example, we can turn to the excellent Information Cascades in Magic (bet you thought I wouldn't find a way to connect back to Magic: The Gathering this week, eh? You are mistaken, friend):

“Say that someone organizes an unusual tournament. Fifty of the players are given identical Blue/White decks in some new format. They are matched up against another fifty players piloting Red/Green decks in this same new format. There is one round played and no one sees the results of anyone else’s match. A person at random is asked which deck he thinks won the most matches. Everyone else is able to hear his answer, though not his result. Then a second person is asked, then a third, and so on. Each person who guesses the “winningest deck” receives a box of product. Everyone, beyond the first person, has private and public information to base their decisions on. The private information is the result of your match. The public information is what everyone else chose before you were asked.
Clearly, the first person will base his decision on the result of his match and how it played out. Now, let’s say you are the fourth person asked. Your goal is to guess correctly and win the product. In your match, Blue/White won easily. However, the first three players all select Red/Green as the deck they think won the most… What do you name?
Most people would go with the group (against their own results), which is the rational thing to do. This would produce the correct answer more often than not, but would also usually start a cascade, essentially dooming everyone else if they were wrong."

The key thing to notice about information cascades is that most of the time, they produce the right answer. It is easy to think our own ideas are better than they are; repeated feedback from the group that we are wrong is often correct.
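To make the dynamics concrete, here is a minimal simulation sketch of the tournament thought experiment above. Everything in it is an assumption for illustration: I give Blue/White a true 60% win rate, use 100 sequential guessers, and have each guesser follow the textbook cascade rule (defer to the public record once it leads by two guesses, otherwise trust your own match result). The article doesn't specify these details; this is just one way to watch a cascade form.

```python
import random

def run_tournament(num_players=100, p_true=0.6):
    """One run of the sequential-guessing experiment quoted above.

    p_true is the (assumed) probability that Blue/White wins any given match,
    i.e. "BW" really is the winningest deck. Each player sees their own match
    result plus everyone else's public guesses, then names a deck.
    """
    guesses = []
    for _ in range(num_players):
        # Private information: my own match result.
        my_signal = "BW" if random.random() < p_true else "RG"

        lead = guesses.count("BW") - guesses.count("RG")
        if lead >= 2:
            guess = "BW"        # public record overwhelms my signal: cascade
        elif lead <= -2:
            guess = "RG"        # cascade in the other direction
        else:
            guess = my_signal   # otherwise my own result tips the balance

        guesses.append(guess)

    # Once a cascade starts, everyone repeats it; the last guess shows
    # where the group ended up.
    return guesses[-1]

if __name__ == "__main__":
    random.seed(42)
    trials = 10_000
    correct = sum(run_tournament() == "BW" for _ in range(trials))
    print(f"Group converged on the winningest deck in {correct / trials:.0%} of runs")
```

Run it and the group usually converges on the winningest deck - but a meaningful fraction of runs lock onto the wrong one after just two unlucky early guesses.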

But it is not always so. The best and most revolutionary ideas appear heretical at first. The most impactful ideas buck popular trends and go against accepted wisdom. If we discarded new ideas whenever the group disapproved, we'd still be huddled around fires and living in caves.

As Warren Buffett has said: "If it sounds like a good idea, it's too late." There is power in going against the informational cascade, in acting in opposition to "common knowledge" and in favor of what you’ve learned through your own, private experience.

Interestingly, this becomes more difficult when your chosen in-group is close-knit.

From the article above:

"The more influence a group’s members exert on each other, and the more personal contact they have with each other, the less likely it is that the group’s decisions will be wise (i.e. inbred play testing)."

As social bonds become more important to each member of the group, social cohesion will become higher-valued and begin to subconsciously influence everyone's decision making. Thus, as groups become closer they also become more insular, preventing dissenting views from being voiced and encouraging informational cascades. Close-knit and insular groups, no matter how successful, will drift towards irrationality.

This is why W. Edwards Deming, the statistician who championed statistical process control, made “Drive Out Fear” one of his 14 Points for Management. Without the willingness to voice dissenting opinions, all groups, big or small, are doomed to fail.

To summarize:

  • Human beings are profoundly other-regarding in our decision-making.
  • This process is unconscious, automatic, and impossible to escape.
  • These mechanisms are mostly adaptive. However, they can tend towards irrationality, particularly when groups become more interconnected and insular.

You can do two things with this information:

1.) Understand that you, too, are subject to the influence of common knowledge.

Calling it out when you see it in your own behavior will help to lessen (not remove) its influence.

(Fair warning: calling it out in others creates immediate conflict. People do not like when these things are pointed out, and aggression often results.)

2.) Exploit it for the benefit of yourself and others.

Next week will be about that.

Best,

Dan


SOMETHING I'M READING:

A very nice summary of an idea I wholeheartedly agree with:

"The unexpected takeaway is the vast majority of people likely haven’t updated their mental model for learning in decades. (Stop and think for a moment on the last time you did this.)"

Learning to learn | K/L
The fastest way to get better at something is to start slow.