The other day, someone in my Inner Circle mentorship program asked me the following:
“Victor, how do you manage your information intake to avoid the ‘filter bubble’ and without falling prey to crackpot thinking?”
Before I share what I do personally and why I do it, I thought I’d start by explaining what happens before and at the start of a McKinsey engagement. (Humor me on this one… it is very much related to broader societal issues.)
Before McKinsey is hired, the many stakeholders in a company (various executives and the board of directors) all have endless opinions on what direction a company should go in.
Should we invest more in sales or R&D? Lots of opinions on this.
Should we focus on customer segment X or Y? Lots of opinions on this.
When the debate can’t be settled and the opinions don’t seem to converge, someone calls McKinsey to help.
The first week or two of most new engagements focuses on compiling something called a “fact pack.”
This is a presentation filled with charts, and it quantitatively defines the facts of the current situation.
A fact pack might have the following slide headlines:
- This business has $100 million in sales and has been growing 20% per year.
- This business has profit margins of 15%, which have been declining for the last three years.
- Customer satisfaction ratings have gone from 90% satisfied to 60% satisfied over the past 36 months.
A fact pack might have 50 to 100 slides that really just define the status quo.
The purpose of doing so is to make sure all stakeholders are informed and operating from the same base of facts.
Often, various stakeholders are informed to different degrees of accuracy. Some perceive the business based on their memory of its state a year ago. Others just got yelled at by a customer and have a skewed view of the business based on that one interaction. Another might have suffered a huge financial loss from a drop in the stock price, even though sales and profits have actually increased.
Most complex decisions involve agreeing on facts, doing some analysis around hypotheses, and deriving a conclusion based on the analysis. That entire process is completely circumvented when stakeholders have widely divergent understandings of the business.
For example, if half the audience thinks the business is growing and the other half thinks it’s shrinking, it’s really difficult to have a productive conversation, as the audiences are operating from two completely different mental starting points.
This is why you always want to start with… the facts.
In terms of how I manage my information intake, I do a few things.
First, I do a lot of primary research… to get the facts.
I find that so much of what passes for journalism today is really just opinions in disguise.
I have been going to primary source documents a lot more lately. If I’m reading articles on a controversial Supreme Court ruling and the articles all contradict each other, I will go to the United States Supreme Court website, download a PDF of the actual ruling, and read the thing for myself.
More recently, there have been a lot of opinions floating around regarding the COVID-19 vaccine efficacy and safety. I downloaded the 53-page Pfizer Emergency Use Authorization request submitted to the Food and Drug Administration. I did the same for the 84-page Moderna Emergency Use Authorization request.
(I use a medical dictionary a lot.)
When the pandemic first started, news media reports ranged from “This might wipe out life on Earth” to “It’s just like the flu.” I noticed the lack of convergence and opted to download the mortality and infection trajectory data published by the Chinese government. I built my own forecast model to do a sensitivity analysis around a few key assumptions.
This analysis led me to withdraw from the outside world a few weeks before the U.S. federal and state governments started doing “lockdowns.”
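That forecast model was, at its core, a compound-growth projection stress-tested across a range of assumptions. Here is a minimal sketch of what that kind of sensitivity analysis can look like in Python — the growth rates, case counts, and scenario names below are illustrative placeholders, not the actual figures or the actual model I used:

```python
# Minimal sketch of a sensitivity analysis on an epidemic growth model.
# All parameters here are hypothetical, chosen only to show the technique.

def project_infections(initial_cases, daily_growth_rate, days):
    """Project cumulative infections under simple exponential growth."""
    return initial_cases * (1 + daily_growth_rate) ** days

# The key assumption to stress-test: the daily growth rate.
scenarios = {
    "optimistic":  0.05,   # 5% daily growth
    "base":        0.15,   # 15% daily growth
    "pessimistic": 0.25,   # 25% daily growth
}

for name, rate in scenarios.items():
    projected = project_infections(initial_cases=100,
                                   daily_growth_rate=rate,
                                   days=30)
    print(f"{name:>11}: ~{projected:,.0f} cases after 30 days")
```

The point of a sensitivity analysis is not the point estimate — it is seeing how wildly the outcome swings as one uncertain assumption moves across its plausible range. If even the optimistic scenario is alarming, the decision becomes much easier to make.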
Second, I run my thought process by several acquaintances who often know more about a given topic than I do. In other words, I leverage my network.
[This approach is still vulnerable to source data being incorrect. I can’t fix that risk. What I can do is eliminate the risk of a journalist misunderstanding the source data and presenting me with a flawed conclusion or an opinion disguised as a conclusion.]
Third, I have a few friends on social media who disagree with pretty much everything I write. I find those friends useful for checking my own thinking and biases. It’s useful to hear contradictory viewpoints.
Now, a practical implication of all this is that doing primary research takes a lot of time!
So in practice, I don’t do it very often, as it’s not worth the time investment. I do it either because I need to make a big decision (like whether to lock myself down or not, or whether to receive an injection of a new vaccine or not) or because I’m bored one day and decide to go on an intellectual adventure.
Most social media noise is just that: noise. A lot of news media today is noise, too. In practice, very few news articles or social media posts will change a major decision I make in life.
For example, what I eat for lunch is pretty much never impacted by what the U.S. government does or does not do on a particular day.
My financial plan for the year is never influenced by whether or not some celebrity did or did not have some secret love affair.
How I raise my kids is never influenced by a particular meme and how many likes it might get.
In short, a lot of noise doesn’t actually matter. It just consumes a lot of mental energy and time.
Finally, after working at McKinsey and spending my entire career in informationally “noisy” environments (inside client companies), I’ve gotten very good at recognizing the difference between facts, opinions, and conclusions.
Facts are things that are objectively true and typically can be verified. Opinions are thoughts people have about the facts. Conclusions are logical derivations from the facts.
One of the most important things I learned at McKinsey was to never confuse opinions with facts or conclusions. Never. (This is how you get fired at McKinsey.)
The key to telling the difference between an opinion versus a conclusion is to ask to see someone’s reasoning.
If the reasoning doesn’t exist, that’s not a conclusion; that’s a belief.
There’s nothing wrong with beliefs. I have many beliefs. However, I try not to mistakenly see my beliefs as facts or logically derived conclusions.
If the reasoning does exist, I ask the person to explain each step of it. I look to see whether each step makes logical sense or whether there’s a huge leap in logic.
If a “conclusion” is predicated on a belief, it’s not really a conclusion… it’s actually just a belief.
The key to using critical reasoning to make good decisions is to minimize how much of your decision rests on belief. To the extent that there isn’t as much data as I’d like (which is nearly always the case), I try to mentally label my beliefs as “assumptions.”
By doing so, I acknowledge that I have uncertainty around a particular step in my reasoning.
Down the road, if new information emerges, I try to revisit my initial assumptions and see if I can replace them with newly discovered facts. If I can, I recheck my reasoning to see if the logic still holds.
If it doesn’t, I revise my conclusion based on newly available data.
In my company, when my staff has a proposal they want to make, I have them write me a memo. I don’t think we’ve ever had a PowerPoint presentation for any internal decision making… only memos.
I like memos because they force the author to identify key facts and explain their reasoning. It is far too easy to produce a massive PowerPoint presentation with absolutely no reasoning whatsoever.
You can still attempt that in a memo, but it sticks out like fingernails on a chalkboard.
When in doubt, start with the facts. It’s a good rule of thumb.