Does the whole world agree with me? Or is it just my filter bubble?

Satyajit Rout
3 min read · Jun 19, 2021
Credit: Beth Kanter

Updating beliefs to accurately reflect the world happens best through direct experience. Fire is hot. The best way to know this is to touch a naked flame. But we do not get the chance to learn everything directly. We learn far more through the communicated experiences of others. It is through words, for example, that both Martin Luther King Jr. and Adolf Hitler delivered their worlds to their followers. What does this leave us vulnerable to?

Annie Duke, in Thinking in Bets, lays out the process by which we form beliefs:

We hear something.

We believe it almost immediately.

We rarely vet it (only if we have time and inclination).

Not only is there little by way of pre-screening (no reference checks, no triangulation) before we elevate something we see or hear to the status of a belief; our post-screening mechanisms are rigged too. Even when we are exposed to disconfirming evidence, we tend to discount its quality and quantity and spin a narrative supporting our already formed beliefs.

Our automatic thinking system is built for efficiency over accuracy. It is a core functionality that has served us well. Developing skepticism (could it just be the rustling of trees?) to counter automatic belief formation (there's a bear in the bushes!) would not have served early man much. So our habit of believing things implicitly is embedded deep within us. We cannot simply switch it off. But it has always been like this. Why is it a problem now?

It is not, for most of us, most of the time. But in a digital world with no gatekeepers to vouch for the quality of information passed on, and no screening mechanism in our heads to vet its veracity, extreme views bubble up to the surface and distort reality. And special times (pandemics, elections) produce special distortions (polarized views, fake news). They nudge us to upgrade our automatic belief formation system into an automatic belief reinforcement system.

We hear something.

We believe it.

We reinforce it, goaded by algorithms and our own confirmation bias.

Charlie Munger said: “I never allow myself to have an opinion on anything that I don’t know the other side’s argument better than they do.”

Munger held this standard before personalized algorithms became what they are today. The best way to break the cycle of automatic belief formation is to do the work required to have an opinion. Instead of spending an evening arguing lazily (something I have been guilty of), we could spend that time gathering evidence from multiple sources, developing a systems view of the subject, and playing devil's advocate.

Gathering information and learning from it is not as easy as it seems. As I most recently discovered while going down the COVID lab-leak rabbit hole, we pay for information with our attention. Nobel Laureate Herbert Simon said "a wealth of information creates a poverty of attention" long before the Internet. A surfeit of information overwhelms us and makes it difficult to re-evaluate our position if we lack the means to efficiently separate the signal from the noise.

One way to gauge the quality of information is to consider its distance from the source. The more degrees of separation from the source, the fewer the nuances. And the lighter information is on nuance, the farther it travels and the more easily it is consumed. How often have we tried to push forward a view picked up from a headline, only to be confounded by the first question from our audience?

I would be interested in a recommendation service that, imagining my opposite alter ego, serves me disconfirming evidence and forces me to argue with myself. I would then go with whichever side wins that internal argument, or be comfortable enough to remain in a state of discovery. If we can come to terms with the possibility that we all have blind spots and that truth-seeking is a process, it may just be easier to admit to ourselves and to the world, "I'm not sure."

Thanks to Atul Sinha for reading drafts of this.
