First off, there are, I believe, really two reasons why we’re so bad at making estimates. The first is the sort of irreducible one: writing software involves figuring out something in such incredibly precise detail that you can tell a computer how to do it. And the problem is that, hidden in the parts you don’t fully understand when you start, there are often these problems that will explode and just utterly screw you.
And this is genuinely irreducible. If you did “fully understand” something, you’d have a library or existing piece of software that does that thing, and you wouldn’t be writing anything. Otherwise, there is uncertainty, and it will often blow up. And those blow-ups can take anywhere from one day to one year to beyond the heat death of the universe to resolve.
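To make this concrete, here is a hypothetical little Monte Carlo sketch (all the numbers — the blow-up probability, the scale of a blow-up — are made-up assumptions, not data). Each task usually lands near its estimate, but occasionally a hidden problem explodes with a heavy-tailed cost. Even a modest chance of that is enough to make the naive sum of estimates badly misleading:

```python
import random

def simulate_project(n_tasks=10, estimate_days=5, blowup_prob=0.1,
                     blowup_scale=20, trials=10_000, seed=42):
    """Simulate total project duration when each task usually takes
    roughly its estimate, but occasionally 'explodes'.

    All parameters are illustrative assumptions, not empirical values.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        total = 0.0
        for _ in range(n_tasks):
            if rng.random() < blowup_prob:
                # A hidden problem explodes: cost is heavy-tailed
                # (exponential with mean blowup_scale * estimate).
                total += estimate_days * rng.expovariate(1 / blowup_scale)
            else:
                # The common case: the task lands near its estimate.
                total += estimate_days * rng.uniform(0.8, 1.2)
        totals.append(total)
    totals.sort()
    return {
        "naive_estimate": n_tasks * estimate_days,
        "median": totals[trials // 2],
        "p95": totals[int(trials * 0.95)],
    }

result = simulate_project()
print(result)
```

Under these toy assumptions, the 95th-percentile outcome dwarfs the naive sum of estimates, and even the median comes out well above it — the blow-ups, rare as they are, dominate the distribution.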
Why You Suck at Making Estimates, Part II: Overconfidence
Kahneman talks at some length about the problem of “experts” making predictions. In a shockingly wide variety of situations, those predictions turn out to be utterly useless. Specifically, in many, many situations, the following three things hold true:
1- “Expert” predictions about some future event are so completely unreliable as to be basically meaningless
2- Nonetheless, the experts in question are extremely confident about the accuracy of their predictions
3- And, best of all: absolutely nothing seems to be able to diminish the confidence that experts feel
What It Feels Like To Be Wrong: Systems I & II, and The 3 Weeks and 3 Months Problem
In Thinking, Fast and Slow, Kahneman explains a great deal of psychology as the interplay between two “systems” that govern our thoughts: System I and System II. My far-too-brief summary would be “System II does careful, rational, analytical thinking, and System I does quick, heuristic, pattern-matching thinking”.
And, crucially, it’s as if evolution designed the whole thing with a key goal of keeping System II from having to do too much. Which makes plenty of sense from an evolutionary perspective: System II is slow as molasses and incredibly costly, so it should be deployed only in very, very rare situations. But you see the problem, no doubt: without thinking, how does your mind know when to invoke System II? From this perspective, many of the various “cognitive biases” of psychology make sense as elegant engineering solutions to a brutal real-world problem: how to apportion attention in real time.