I receive many emails asking if I have a framework for XYZ kind of case.
Often XYZ is something a little unusual, such as a non-profit case, a human resources organization design case, a manufacturing process improvement case, or an IT vendor selection case.
While my focus is on strategy consulting cases, I still get these questions, mostly because there is so little information online about how to solve these other kinds of cases.
When I get these questions, I can tell just by how the question is phrased that the person asking the question does not grasp something very fundamental about how to approach solving a case.
There seems to be this obsession with picking the right "framework."
This obsession is, in my opinion, misplaced.
When I get such a question, my instinctive thought is to reply and ask essentially two questions, the answers to which will provide the "right" framework for any case -- strategy, human resources, manufacturing, or technology.
The two questions are:
1) What is your hypothesis?
2) What data do you need to disprove it?
In other words, in terms of importance -- the hypothesis comes before the framework.
Now sometimes it does make sense to use a standard framework just to learn a little more about the client situation... and then form a hypothesis about 5 - 6 minutes into the case (which often involves revising the framework to better fit the hypothesis).
But in that situation, the only value of the framework is to ask a few high-probability questions that are likely to uncover information that allows us to form a hypothesis, based on informed intuition.
So remember this... (and what I am about to say is worth writing down, circling, and putting a few stars next to).
The hypothesis is more important than the framework...
This is true for the simple reason that it is not possible to determine the correct framework until after you have determined your hypothesis.
(And just to clarify... the early strategy of starting with a standard framework is really a strategy of intentionally picking a framework that has a good chance to be wrong... for the sole purpose of figuring out a hypothesis which then allows us to determine the correct framework.)
While this point is fundamental to case interviews, I find it seems to take a while for people to really master this concept.
Sure, everyone understands it intellectually, but they often have not mastered the concept to the point they will automatically use it in an interview out of habit.
I find as a teacher, it is often helpful to reinforce a key point by teaching the point in a different way.
(Incidentally, this is the subtle difference between being repetitive... saying the exact same thing, the exact same way... and "reinforcing a point," which is making the exact same point, but in a different way each time.)
So let me take another shot at explaining why the case study interview process is not a framework-driven approach, but instead is a hypothesis-driven one (yet another point worth writing down, circling and putting some stars next to).
My oldest daughter is learning science for the first time in school. For reasons that are a mystery to my wife and me, she is just in love with science (despite the fact that both of us have very little interest in the topic).
Instead of reading children's stories, which she finds boring, she prefers to read science encyclopedias.
Well one of the concepts she is learning in her school is the idea of a "hypothesis"... yes the same concept that applies to case interviews.
In practice, they are learning what a hypothesis is before actually becoming familiar with the word "hypothesis" itself.
What happens in class during science time is the teacher prepares some kind of science demonstration... something cool. And before the teacher starts the demo, he asks, "What do you think will happen?"
Everyone ventures a guess, and the demo is perfectly set up to "prove/disprove" everyone's best educated guess.
And this is where things get interesting. The kids get so excited about the demo, and after they get "wow"ed, they ask the teacher, "Wow, that was so cool... what if we changed X in the experiment... what would happen then?"
The teacher often really does not know because he has never performed that variation of the experiment before. So he says to the class, "Geez, that's a good question. I don't know what will happen. What do you think will happen?"
Everyone yells out their best guess (i.e., their hypothesis), and the teacher changes the experiment and re-runs it, generally proving the hypotheses of half the class while disproving those of the rest.
For example, I recently helped my daughter conduct an experiment in class for her friends. It was the "Mentos and Diet Coke" experiment.
If you are not familiar with this experiment, go to YouTube and search for it... it is very cool.
Basically you drop some Mentos mint candies into a 2-liter bottle of Diet Coke, and the Coke explodes and shoots out of the bottle, creating a 25-foot geyser of Coke.
So when my daughter practiced the experiment at home, she did it by placing seven mints into the Coke bottle (someone had recommended this online as the optimal number).
So at school, after running the experiment, someone asks, "What if you put more Mentos candies into the bottle? Will it shoot even higher? Will there be even less (residual) Coke left in the bottle?"
My daughter did not know the answer. I did not either. So I suggested, "Why don't we just try it again and see what happens?"
Every child guessed what they thought would happen (i.e., they all stated their hypotheses) and my daughter re-ran the experiment, this time with 11 Mentos candies (the maximum that the candy holder we had screwed to the top of the bottle could hold).
And guess what happens?
The height of the geyser and the amount of Coke left in the bottle were the same with 11 candies as they were with seven.
Huh... isn't that interesting?
Most kids had hypothesized the opposite of what actually happened.
Maybe the Mentos hits a point of diminishing returns at seven candies.
Maybe the shape of the bottle limits the amount of Coke that can actually escape, regardless of the number of candies.
My point is that if I had brought an unlimited supply of Diet Coke bottles that day (I only brought three), we could have tested every child's hypothesis by re-running the experiment slightly differently until, via a process of elimination, we were able to validate a final hypothesis.
Now if you think back to high school science class, I am willing to bet this is something you learned there (or at least you were taught it... whether you actually learned it or not is, I suppose, a different story).
For me, I actually did learn it, but then forgot about it after my studies moved away from science, towards economics and business.
And it was only many years later... even after working at McKinsey for a while... that it finally dawned on me that all we're doing in consulting is running the scientific method on business problems... instead of scientific ones.
It seems so obvious in hindsight, but I simply did not grasp the idea that....
Hypothesis in science = hypothesis in case interviews.
Actually to be more precise, the connection that I really did not fully appreciate at the time was how...
Scientific Experiment Design = Case Interview Issue Tree Design
I never really thought of a case interview as an exercise in designing an experiment.
LOL. I always thought the case interview was just some way an employer could torture you before giving you a job!
I was so distracted by the new terminology and concepts... the stress of a live interview... and all this other "stuff", that I just didn't realize a case interview issue tree design was really the equivalent of designing an experiment in high school science class.
So the "problem structuring" that firms refer to in a case interview is really the combination of your hypothesis + issue tree... it is the "design your science experiment" aspect of the case.
Then as you work each branch of your issue tree requesting data, all you are doing is running the experiment you previously designed, collecting data, refining your hypothesis, and repeating the whole process.
My emphasis on the hypothesis-driven approach, rather than the framework-driven one, is the reason I get so many seemingly contradictory success stories.
I get as many success stories that say:
1a) "Your case interview preparation materials helped me get my offers, but I did not use any of your frameworks."
as I do those that say:
1b) "Your case interview practice materials helped me get my offers, and I used all of your frameworks -- and found them very flexible."
At first it seems like these two people are talking about completely opposite things. But in fact, they are talking about the same thing.
Some people did not like my particular approach to testing a hypothesis because it did not match their intuitive way of thinking. In those cases, they created their own frameworks and used them instead.
Keep in mind there often are multiple ways to test a hypothesis. It does not matter which one you use, so long as your approach actually does test the hypothesis.
Others found my frameworks to be intuitive to their natural way of thinking. So they did use my frameworks in a flexible way -- the emphasis is on the word "flexible".
When people use the word "flexibility" as it relates to my frameworks, what they are really saying is: "I used just one part of your framework for this particular case because that was all I needed to prove my hypothesis."
Or: "I used the whole framework, but within each branch, I skipped several items you had listed because those specific pieces of data were not helpful in testing the specific hypothesis I had for that specific case."
You'll notice the common thread between these two types of success stories is letting the hypothesis dictate how the framework or issue tree needs to be structured.
It is often said that if you want to truly master a topic, try to teach it... and then you will be forced to master it in order to explain it to someone else.
Given that rule of thumb, I suppose I am still learning about the case interview because I keep noticing new things when I am teaching it that I did not notice when I was a candidate.
To "ace the case", for example, the key is not to memorize a ton of frameworks and spit them back out at the interviewer (a.k.a. "framework vomit").
The key is to master a critical thinking process -- a process that, in every case situation, is both similar to and different from prior instances of using it.
It is not easy to teach a thinking process where how the process is applied in a case varies from one case to another.
You can't create a set of rigid rules to follow... only a set of "rules of thumb."
You can't give a structured recipe to follow (always do X first, then Y)... you can only provide a set of principles (that you then have to apply differently on a case-by-case basis).
It is like watching a master chef who cooks without recipes or measuring cups...
You wonder, "How in the world do they know when to add a certain ingredient or how much to add?"
When you ask a chef this question, invariably they have some mental rules of thumb they use to determine the answer.
They watch for when the water boils rather than watching a clock.
They look to see when certain items in the soup float, indicating they are ready to be served, rather than using a thermometer.
They look for a certain color in the dish that indicates the right proportions of ingredients have been added.
So after you explain these rules of thumb, how in the world do you teach the overall process?
It turns out the best way I've found to do so is to teach by example and demonstration.
For example, in Look Over My Shoulder®, for most cases in that program, I have three different candidates doing the exact same case.
One poorly, one okay but still with some flaws, and one perfectly.
By being able to hear the differences between the candidates yourself, you're better able to see how the many concepts I talk about are actually all brought together at the same time, while under pressure.
It is one thing to know you're supposed to use a hypothesis. It is another thing entirely to notice when someone else does it incorrectly. And it is yet another thing to be able to do it yourself while under the natural stress of an interview.
The purpose of Look Over My Shoulder® and practicing cases with a partner is to convert case interview knowledge into case interview habit.
The two are most definitely not the same thing.
This is also why there is a very high correlation between those candidates who practice a lot and those who get offers.
It is not easy to convert knowledge into habit without practice.
So don't just make note of my comments today about the role of the hypothesis.
Go out and practice this approach today.
If you don't have a practice partner, the next best thing is to use the Look Over My Shoulder® program -- at least five times to build that kind of habitual familiarity with all the key skills.
If you are using LOMS correctly, you should be able to spot all the mistakes certain candidates make in a case before I point them out in my recorded commentary.
This is a good sign that your case interview knowledge is becoming an instinctive habit.
One way or another, you want to turn that conceptual knowledge into real world habit before your next interview.