Playing The Meta, Pt.3: Power Rituals
HEY - if you're a business owner, read this.
My partner and I are leading a live workshop all about one of the most critical (and under-taught) business secrets of all:
Unit profitability.
Understanding this number (and how to calculate/use it) is key to knowing how to grow your business.
Get it wrong, you go nowhere - or worse.
But get it right? All of a sudden the path forward is crystal clear.
We're going to do the whole thing in an hour. It's $47. You'll get the recording if you can't make it. And there's a 100% no-fuss money-back guarantee.
Check it out here if you're interested. Thanks!
This essay is the third and final in our series on meta-games.
In part 1, we discussed the concept of playing the meta - winning the game by focusing on the players, rather than the game itself.
In part 2, we discussed information cascades - a type of group irrationality that occurs when we mistake group consensus for truth.
In truth, I wrote those essays so I could write this one.
Agency is the belief that you can do things in the world. You give yourself permission to do great things.
Power is the ability to do those things, to create tangible results in any situation.
If you have both agency and power, all things are possible.
Freedom to act.
The ability to create change.
That's what I want for myself.
That's what I want for you.
That's what this essay is about.
Noticing irrationality brings you power. Why is this so?
Once you begin to see the extent to which our behavior is driven by cognitive shortcuts, heuristics, and unconscious processes, it becomes much easier to influence that behavior.
We stop wondering why our patiently constructed logical arguments have no effect. We stop wondering why things feel chaotic, why certain things make us feel bad even though they lead to good outcomes, and certain things make us feel good even though they lead to bad outcomes.
Spotting and exploiting the irrationalities around us can make you rich.
It can make you influential.
Or you could simply retire to a lake house and read in peace.
The point is not what you decide to do with this power. That is your own choice to make.
The point is that you are surrounded by forces that have already internalized all of this.
The institutions of political, religious, cultural and social power are not operating under the delusion that we typically make rational decisions following a long process of deliberate thought.
It is not and was never a question of "will someone exploit our tendencies towards irrationality?"
That came to pass long ago.
The only question is whether you will take that power for yourself - and the responsibility that accompanies it.
In parts 1 and 2 we established:
- The most powerful way to win is to zoom out of the context of the game itself and see the context in which the game is situated. In essence, to play the players, not the game. We call this the "meta-game."
- People are profoundly other-regarding. Our behavior changes not when what we know changes, but when our common knowledge (what we "know" that others "know") changes.
- One mechanism that can cause in-groups to mistakenly conclude certain strategies are the best when they are not is information cascades.
- These behaviors are automatic, unconscious, and inevitable.
Understanding this is the meta-game.
In any conversation, transaction, or relationship, there will be the text - the content of what is said, the rules of the game, the state of play - and there will be the subtext - the larger context in which the text occurs. For human relations, this subtext is already and always what we have been discussing: common knowledge games, other-regarding behavior, etc.
Noticing a new level of subtext that falls just outside of a predominant way of understanding the world is an enormous source of power.
The fact that it will be universally derided and ignored when discovered makes it even more so.
Take Cate Hall's discussion of becoming a successful poker player as an example:
The idea of finding real edges, as contrasted with "eking out wins by grinding harder than everyone," first clicked for me when I started playing poker. Poker in the modern era is an extraordinarily competitive game, and even 8 years ago pros were spending nearly as much time studying as they were playing, using solver models to seek out tiny mathematical advantages. At the same time, a massive edge was available in the form of physical reads, but almost entirely ignored. (I know an example would make this more compelling, but I'm sorry, it's like explaining a magic trick.)
Two friends and I maniacally studied reads together, and we all had out-of-distribution results. But when we'd tell other pros what we were doing, the response from most was "nuhuh, that's not a thing." They weren't willing to consider the possibility that reads were valuable, maybe because they didn't want to feel obligated to study them.
The reaction she describes is not unique to poker players.
My absolute favorite treatment of this phenomenon is the wonderful Moneyball by Michael Lewis. The entire book is about the exploitation of irrationality.
(I haven't seen the movie, but the book is absolutely worth the price of admission. Even if, like me, you have very little interest in baseball).
Moneyball follows the Oakland A's, the "poorest team in baseball" at the time. Many believe that baseball is a sport dominated by rich teams who can afford to pay high salaries and recruit the best players. Given the A's lack of financial resources, one would expect them to do poorly; and yet, season over season, the opposite occurred.
Moneyball argues that this track record was due to the influence of General Manager Billy Beane. Beane believed in using statistics to value players. He looked for characteristics that (1) had an outsized impact on the outcome of the game, and (2) were systematically undervalued by other teams. This approach allowed him to find "deals" - players who could help the team win, but whose salaries were lower than average.
(This wasn't Beane's invention - many had been arguing for a similar approach for years. Beane was just one of the first to listen.)
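Beane's two-part filter can be sketched as a toy screen over player data. (A minimal sketch: the players, stats, and salaries below are invented for illustration, not real figures. On-base percentage stands in for the "outsized-impact, undervalued" characteristic.)

```python
# Toy sketch of the Moneyball screen. All names and numbers are invented.
# Idea: (1) pick a stat with outsized impact on winning (here, on-base
# percentage), then (2) rank players by that stat per dollar of salary
# to surface the ones the market has mispriced.
players = [
    # (name, on-base percentage, salary in $M)
    ("Player A", 0.390, 8.0),
    ("Player B", 0.360, 1.2),
    ("Player C", 0.310, 6.5),
    ("Player D", 0.345, 0.9),
]

def obp_per_dollar(player):
    """Value metric: OBP delivered per $M of salary."""
    name, obp, salary = player
    return obp / salary

# The best "deals" rise to the top, regardless of headline salary.
deals = sorted(players, key=obp_per_dollar, reverse=True)
for name, obp, salary in deals:
    print(f"{name}: OBP {obp:.3f} at ${salary}M")
```

The point of the sketch is the ranking, not the numbers: the cheapest players can top the list while a star with a big contract falls to the bottom.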
Lewis argues that baseball had been dominated by a kind of superstitious belief in a player's "intangibles." Baseball scouts brought years of experience to the evaluation of players, often getting a "gut feeling" about whether a player would be successful. They relied on a kind of subconscious pattern-matching based on hundreds of other ballplayers they'd seen.
That these feelings did not seem to correspond to actual outcomes had little effect on the practice. Beane sought to exploit this tendency.
Beane's approach was successful but did little to convince his detractors. One of the recurring themes of the book is just how resistant people are to having their mental models of the world updated, even when the results are staring them in the face. Over and over again, people tell Beane and his advisors that, while statistics are all fine and dandy, they don't get at the heart of what makes a baseball team successful. Every single one of Beane's detractors has personal experience that tells them they are correct. Sometimes, decades of experience. They have seen the evidence with their own eyes.
One of my favorite quotes from the book:
“I just don’t see it,” says the vocal scout.
“That’s all right,” says Billy. “We’re blending what we see but we aren’t allowing ourselves to be victimized by what we see.”
"Victimized by what we see."
I highlighted that one, then I underlined it.
Then I wrote "power" in the margin.
We all overestimate how representative our own experiences are.
What we see represents "reality" to us. We lose track of how non-representative a single sample can be, especially when that "sample" is our own experience.
This is how information cascades occur. We see someone win; we see other people see them win; we see other people discuss it. The group concurs. All evidence points in one direction.
Once we have internalized this, it is extremely hard to come to believe otherwise. Data be damned.
A discomfort with looking foolish holds these tendencies in place. None of us like to feel like the "odd one out." If we invest our money in the same funds as everyone else and we lose it all - well, that's OK. We can curse the markets, curse our bad luck, curse everyone but ourselves.
But what if we invest our money differently and then we lose money? What if everyone else makes money?
Well. That feels a lot like humiliation. Humiliation and death aren't far apart.
It’s better to lose it all the way everyone else lost it all.
It’s better to be wrong in a group.
It’s better to be safe.
It’s better to be irrational.
You think this way.
I think this way.
But if we see it, if we can distinguish it...
Maybe, just maybe, we can choose to be right alone rather than be wrong together.
Power goes to those who can spot the places where irrationality has led people away from a winning strategy.
These places are not rare. In fact, they are everywhere.
Spotting bias and irrationality gives you an edge.
But to make that edge your own, you have to be willing to be wrong.
Cate Hall (the poker player from earlier) calls this "the moat of low status":
The moat of low status is one of my favorite concepts, courtesy of my husband Sasha. The idea is that making changes in your life, especially when learning new skill sets, requires you to cross a moat of low status, a period of time where you are actually bad at the thing or fail to know things that are obvious to other people.
It’s called a moat both because you can’t just leap to the other side and because it gives anyone who can cross it a real advantage. It’s possible to cross the moat quietly, by not asking questions and not collaborating, but those tradeoffs really nerf learning. “Learn by doing” is standard advice, but you can’t do that unless you splash around in the moat for a bit.
You have to be willing to ask the "dumb question" (which is usually "how do we know?").
You have to be extra-willing to spot your own irrationalities and foibles. Knowing about them ahead of time does not make you immune.
You have to be willing to look stupid.
Even worse:
You have to be willing to feel stupid.
Harder than it sounds.
So:
How do we do that?
For one, find or build a community of people committed to rationality and willing to tell you the truth.
To return to Information Cascades in Magic:
Typically, one’s ideas are not as good as one thinks. This means if you are consistently choosing an idea that bucks the crowd, you are probably putting yourself at a disadvantage. Of course, people do have good ideas (someone has to have them). The key is to find a group of playtest partners who recognize the difference between your good and not-so-good ideas, and who are willing to say so. Also, be careful to sort among your friends’ strong and weak ideas. Long story short, if you typically [look up strategies online], you should innovate more. If you typically go rogue, you should imitate more (both assume your goal is to win more).
Secondly, learn about cognitive biases, common knowledge, game theory, etc. You cannot distinguish something without having some idea that it is there; our biases are not immediately visible to us. The books are out there: read them.
Finally, make checking your own thinking (and that of others) for these issues systematic. This means a checklist or a deliberate process of self-analysis.
Apply this process liberally. Note the blind spots of others - these are where the edges lie.
The bad news:
It's a lot harder than it sounds. It takes discipline. It takes work.
The good news:
Once you know it's there, you will start to notice it everywhere.
Now you know it's there.
Go play the meta.
Yours,
Dan
Something I'm reading:
I highlighted the living daylights out of this one. Particularly pertinent given my tendency to measure, manage, and - well - probably over-manage so many things in my life.