I recently finished reading (no, devouring) Daniel Kahneman's masterpiece of a book, Thinking, Fast and Slow.
Kahneman won the 2002 Nobel Prize in economics, even though he is technically a psychologist. Most of his work has focused on the psychology of decision making.
Thinking is one of the best business books I’ve read in the past several years. It is essentially a comprehensive summary of many of the major findings of psychological research from the past 50 years, applied to problems we commonly face in business and other organizations.
There were four topics in the book I found particularly interesting:
- Biases in planning and forecasting, and how to avoid them
- Loss aversion, and why companies throw good money after bad
- When to trust intuition
- The superiority of algorithms in personnel selection
I’ll cover the first topic in this post and summarize the others in subsequent posts.
Kahneman discusses five reasons plans and forecasts are often inaccurate:
- Exclusive use of an “inside view.” The forecaster only considers personal experience and information about the specific project, team, or instance.
- Insensitivity to the quality of information. The forecaster gives weak or unreliable evidence the same weight as strong evidence.
- Optimism and overconfidence. The forecaster only anticipates problems that can be seen or predicted today, and fails to consider the plans of others, the role of luck, or Donald Rumsfeld’s famous “unknown unknowns.”
- Political incentives. These can arise from competition among projects for approval or from a desire to deceive.
- Substitution of judgments of plausibility or representativeness for judgments of probability.
Making plans more robust requires counteracting each of these potential pitfalls.
One way to do this is to use an “outside view.” Start with a relevant base rate or the performance of a reference class, and adjust from there based on the quality of the evidence. For example, suppose you are considering an IT investment that is supposed to take 12 months and cost $10 million. You might look at historical IT investments to see how those projects’ projected and actual timelines and costs compare. If you found that such historical projects took 25% longer and cost 33% more than initially projected, you might then consider the strength of any evidence (or lack of evidence) that this current project will be the exception and be delivered on time and on budget.
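The outside-view adjustment above is just arithmetic, and can be sketched in a few lines of Python. The overrun figures are the hypothetical ones from the example, not real project data:

```python
def outside_view_estimate(planned_months, planned_cost,
                          schedule_overrun=0.25, cost_overrun=0.33):
    """Adjust an inside-view plan by a reference class's average overruns.

    The default overruns are the illustrative 25% schedule and 33% cost
    figures from the example; a real analysis would derive them from a
    sample of comparable past projects.
    """
    return (planned_months * (1 + schedule_overrun),
            planned_cost * (1 + cost_overrun))

# The hypothetical $10M, 12-month IT project:
months, cost = outside_view_estimate(12, 10_000_000)
print(f"Outside view: ~{months:.0f} months, ~${cost / 1e6:.1f}M")
```

The point is not the multiplication itself but the discipline: the reference class supplies the starting estimate, and the specifics of the project only justify departures from it.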
Another approach is to use a premortem. Ask a group of people knowledgeable about the decision to participate as follows: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”
At Woodlawn, we often use scenario planning to help our clients see and internalize the assumptions that must hold about the resolution of key uncertainties for different versions of the future to come to pass. As part of this process, clients often ask us to help them rate the probability of each scenario occurring.
As useful as scenarios are, I’ve often been uncomfortable with assigning probabilities, and Kahneman explains why in Thinking. Logically speaking, the more detail in a scenario, the lower the probability that it will come precisely true. However, the same detail makes the narrative richer and more plausible-sounding, making it seem more probable. To avoid biasing plans and forecasts, scenarios should be a tool to help expand management’s view of possible future states, not to narrow it down to three, four, or five very specific outcomes.
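The logic here is the conjunction rule: a scenario is a conjunction of details, and the joint probability of a conjunction can never exceed that of its parts. A toy calculation, with made-up per-detail probabilities, makes the shrinkage concrete:

```python
from math import prod

# Hypothetical: suppose each specific detail in a scenario independently
# has an 80% chance of holding. The probability that the full narrative
# comes precisely true is the product, which shrinks with every detail
# added, even as the story sounds richer and more plausible.
detail_probs = [0.8, 0.8, 0.8, 0.8]
joint = prod(detail_probs)  # 0.8 ** 4 = 0.4096
print(f"Probability all {len(detail_probs)} details hold: {joint:.0%}")
```

Four fairly likely details already leave the full scenario less likely than not, which is exactly why a vividly detailed scenario should widen the set of futures under discussion rather than stand in for a forecast.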
I’ve written three more posts covering the other topics from the book. The next one is about loss aversion, and why companies often throw good money after bad.