Here's a sample data sufficiency mini-case question that a McKinsey partner asked me many years ago. See my reply below.
Volvo – The Safest Car in the United States*
* New US government report shows that fewer people die in a Volvo than in any other car brand in America
Assess the validity of this statement. You have 3 - 5 minutes to do so. Go!
BEFORE you look at the answer below (I know it's tempting), I strongly recommend you try ANSWERING the question yourself first.
To do so, click here to post an anonymous answer and to see how 1,300+ other people answered the question... THEN come back and look at the answer below. You will get a lot more value out of it this way because frankly there are very FEW practice opportunities available. So don't use them up by just reading them; use them by ATTEMPTING to answer them first.
Click Here to Submit an Answer to This McKinsey Problem Solving Test - Example Test Question
The validity of the claim that Volvo has the safest car in the US because fewer people die in it is at best ambiguous. I have several concerns about this claim:
1) The definition of "safe" -- the statement assumes that a safe car is only one where someone does not die IN the car. But it's possible the passenger died later or suffered severe injury.
Now let's assume that a safer car really is one where fewer people die in it. The next concerns I have are:
2) # cars on the road - Perhaps Volvo has FEWER cars on the road and you'd expect them to have FEWER deaths in their cars because of it. To test this, I would need to know how many cars from Volvo are on the road vs. other brands, and compare the relative market share of cars to share of deaths.
3) # passengers in the car - Perhaps single people drive Volvos, so there's only one person in the car, compared to, say, a Toyota, which may be a family car carrying more passengers. So maybe when a Volvo crashes it kills its one occupant more OFTEN, while in other car brands each occupant dies LESS OFTEN but there are MORE passengers per car. To test this, I would need to know the average number of passengers per trip in Volvo cars vs. other manufacturers.
4) # accidents - Perhaps Volvos get into accidents a lot more often than other cars, but once you're in an accident you're less likely to die. For example, maybe Volvo brakes don't work well, so you crash all the time, but the Volvo body frame construction and airbags are excellent. To test this, I'd need to know how many accidents involve Volvos compared with other cars -- especially relative to their market shares.
5) Mileage - Perhaps Volvos are driven less often, for shorter distances, than cars from other manufacturers. If Volvos are driven less often, for fewer miles, then they spend less time at risk of being in an accident -- so it's possible the car is actually more dangerous, but simply used less. To test this, I'd need to know how many miles per year the average Volvo is driven compared to other car brands.
6) The driver - Perhaps Volvos are not actually safer cars; perhaps Volvo DRIVERS are safer. This might be hard to test with data, but to start it would be useful to get the accident history of drivers who own Volvos when they drive NON-Volvo cars.
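The "share of deaths vs. share of cars" test in point 2 above can be sketched in a few lines of code. All of the numbers below are entirely hypothetical (the brand names and figures are invented for illustration, not real data); the point is simply the mechanics of the comparison:

```python
# Hypothetical fleet sizes (millions of cars on the road) and annual
# deaths per brand -- invented numbers, purely for illustration.
fleet = {"Volvo": 0.5, "BrandA": 3.0, "BrandB": 6.5}
deaths = {"Volvo": 40, "BrandA": 300, "BrandB": 560}

total_fleet = sum(fleet.values())
total_deaths = sum(deaths.values())

for brand in fleet:
    fleet_share = fleet[brand] / total_fleet
    death_share = deaths[brand] / total_deaths
    # A brand whose share of deaths exceeds its share of cars is
    # overrepresented in deaths; a lower share suggests the opposite.
    flag = "over" if death_share > fleet_share else "under"
    print(f"{brand}: {fleet_share:.1%} of cars, {death_share:.1%} of deaths "
          f"({flag}-represented in deaths)")
```

With these made-up inputs, a small fleet can produce few total deaths while still being disproportionately deadly per car, which is exactly why the raw death count in the headline proves nothing on its own.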
So, to summarize conceptually:
# deaths* =
(# Volvos on the road) X
(# passengers in the car) X
(% chance of having an accident) X
(% likelihood of dying in the event of an accident**) X
(miles driven per year) X
(driver's likelihood of getting into an accident)
* assuming fewer deaths = safe car
** this can be computed from the original statement
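The conceptual formula above can be turned into a rough sketch in code. Every number below is hypothetical (invented brand names and figures, not real data), and I've folded the two accident-likelihood factors into a single per-mile accident rate, which is one reasonable reading of the formula:

```python
# A sketch of the death-count decomposition, with HYPOTHETICAL inputs.
# It shows how a brand can post fewer TOTAL deaths while being MORE
# dangerous per accident, once fleet size and exposure are accounted for.

def expected_deaths(fleet_size, avg_passengers, miles_per_year,
                    accidents_per_mile, fatality_given_accident):
    """Expected annual deaths for one brand, per the conceptual formula."""
    accidents = fleet_size * miles_per_year * accidents_per_mile
    return accidents * avg_passengers * fatality_given_accident

# "VolvoLike": small fleet, few miles, but a HIGHER fatality rate per accident.
volvo_like = expected_deaths(fleet_size=100_000, avg_passengers=1.2,
                             miles_per_year=8_000, accidents_per_mile=2e-6,
                             fatality_given_accident=0.02)

# "OtherBrand": large fleet, many miles, LOWER fatality rate per accident.
other_brand = expected_deaths(fleet_size=1_000_000, avg_passengers=2.0,
                              miles_per_year=12_000, accidents_per_mile=1e-6,
                              fatality_given_accident=0.01)

print(f"VolvoLike total deaths:  {volvo_like:,.0f}")
print(f"OtherBrand total deaths: {other_brand:,.0f}")
# VolvoLike records fewer total deaths despite a higher fatality rate
# per accident -- the headline's logic in reverse.
```

This is exactly the trap in the original claim: the headline compares only the left-hand side (total deaths), while "safety" lives in just one of the factors on the right.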
Critiquing Other People's Answers
In reviewing the hundreds of answers people submitted for this question, here are a few observations.
1) Many people fixated on just one or two problems with the statement. To really answer this question correctly, you need to be more thorough and complete.
2) Some people gave only philosophical answers -- you can't really trust the government, it's an overly narrow definition... etc.
In consulting and in case interviews, you need to be more specific.
3) The answers that I would rate as pretty good are ones that included at least 4 of the items above, and did so preferably in some systematic way.
(Note: stating the formula above explicitly wasn't necessary. I didn't in my interview and still passed. But the formula-like approach is consistent with how the top candidates tend to think about ANY kind of problem.)
4) Many people said the claim was "incorrect". I don't think this is a factually supported statement. It is more accurate to say the validity of the claim is ambiguous or unclear, because additional information is needed or it depends on how certain words like "safe" are defined. So if you said the claim was "wrong" or "incorrect" on the McKinsey Problem Solving Test or if asked verbally in the middle of a case interview, I would consider the response as "imprecise" (consulting speak for "wrong").
Some people object that when they said the claim was "wrong," they really meant the argument is flawed and therefore the conclusion may or may not be correct. BUT, they didn't SAY (or in our case write) that. And if you don't say (EXACTLY) what you mean, then you don't mean what you say.
Does this seem picky? ABSOLUTELY.
And it IS actually that important. If you tell a client they are "wrong" when what you really mean is that their rationale has a logical flaw or ambiguity in it, they will take offense at the former and be unconvinced, but they would be persuaded by the latter. So, the big takeaway is:
CHOOSE YOUR WORDS VERY, VERY PRECISELY.
When I posted this question previously, I received 1,300+ answers from other readers of my blog (aka your potential competitors in interviews). You can see how others answered the same question here: McKinsey Problem Solving Test Example 1, and see how your response stacks up against others'.