Why are we so bad at making estimates?

10.05.2017

Estimates most often go wrong. When it comes to the economy, politics and everything else we do, we rarely see the future as it will be. Why are we so bad at this?

The reasons why making estimates is so difficult naturally depend on the type of estimate we are trying to make. Nevertheless, many errors have similar roots, and quite a number of them are caused by our inability to handle information.

Researchers have identified dozens of different cognitive biases, far too many to discuss fully here. However, I would like to present a few that seem to prevail no matter where and when estimates are made.

Protective mechanisms of the mind

The mind's protective mechanisms, which aim to maintain a positive self-image, are part of our personal psychology. It is difficult to learn from mistakes if we do not admit, even to ourselves, that we have made them. Optimists may believe in their abilities even when they no longer should.

A certain amount of optimism is both normal and good. For example, most of us believe that accidents are more likely to happen to others than to ourselves. However, optimism can also go too far.

This unfounded optimism is associated with a delusion whereby we take credit for all of our successes and blame others for our failures. This self-serving bias can be seen in every human activity: we always see our own actions in a positive light.

This delusion becomes hazardous, for example, when we take too many risks in business – when optimism overrides cold probability calculations. This often leads to poor decisions and, in the worst case, drains the company's cash reserves.

Anchoring

Anchoring means that we fixate on the first thing we hear. Especially when we have nothing better to go on, we tend to cling desperately to the first piece of information offered to us.

Are there more or fewer than 500 million trees in Finland? Some say more, some say fewer. But, no matter what you say, your estimate of the number of trees is probably close to 500 million. However, the correct number is more than 70 billion.

If you have no knowledge of the subject, you are especially susceptible to being led by an anchor. This also applies when there is not enough time to calculate the answer.

An interesting example involves anchoring on the digits of a personal identity code. In a study (Ariely, 2004), students were asked to write down the last digits of their code on a piece of paper. They were then asked to estimate the prices of a computer, wine and chocolate.

The students whose code ended in large digits tended to give higher estimates, whereas the students whose code ended in small digits estimated lower prices.

As the example shows, we have a strong tendency to fill gaps with whatever information is available, no matter how arbitrary it is. Once an anchor has found its way into our minds, it is difficult to get rid of.

Mathematical ideal

A bit more abstract, but equally human, is our tendency to see the world as mathematically regular. Confusing genuine uncertainty with calculable probabilities is a good example of this bias.

The chaotic world around us is not like a casino, where the risks of events can be calculated objectively. Instead, the real world contains two types of information gaps: 1) risks, which may be rare but can still be assigned a probability, and 2) genuine uncertainty, which simply cannot be known in any way.

In his excellent book The Black Swan, Nassim Taleb shows how mixing these two easily leads to misconceptions. The result is that we try to model something that cannot be modelled, and our estimates end up inaccurate.

Choosing the wrong tools also makes mathematical modelling harder. In the real world, not everything follows the assumption of a normal distribution that is built into many estimates. The world is rarely linear, and it rarely follows a normal distribution.
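To make this concrete, here is a small Python sketch of my own (not from the original text) that compares how much probability a normal model and a fat-tailed Student's t model assign to the same extreme event. The six-sigma threshold and the three degrees of freedom are arbitrary illustrative choices, not estimates of any real market.

    # Illustrative sketch: the fat-tailed model (Student's t with 3 degrees of
    # freedom) and the six-sigma threshold are assumptions made for this example.
    from scipy import stats

    sigma_level = 6                   # an extreme move, six standard deviations out
    df = 3                            # degrees of freedom for the fat-tailed model
    t_std = (df / (df - 2)) ** 0.5    # standard deviation of a t(df) variable

    # Probability of a move at least this many standard deviations under each model
    p_normal = stats.norm.sf(sigma_level)
    p_fat_tailed = stats.t.sf(sigma_level * t_std, df)

    print(f"Normal model:     P(move > {sigma_level} sigma) = {p_normal:.1e}")
    print(f"Fat-tailed model: P(move > {sigma_level} sigma) = {p_fat_tailed:.1e}")
    print(f"The fat-tailed model assigns about {p_fat_tailed / p_normal:,.0f} "
          f"times more probability to the extreme event.")

If the world genuinely has fat tails, a model built on the normal assumption can understate the probability of an extreme event by several orders of magnitude.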

A typical example of incorrect forecasts is the 2008 financial crisis: the risk analyses of many banks assigned virtually no probability to such an event.

What is the moral of this story?

There are loads of similar examples. The peculiarities of the mind are not merely academic quirks – they are factors that affect every decision we make. That is why we should be aware of them.

However, awareness alone is not enough. Probably the only way to reduce the impact of these peculiarities is systematic training that also involves feedback. But how many of us are actually willing to start practising? Self-deception is easy, in this case too.

In the fall, Psycon will offer training on these peculiarities and how to eliminate them. Stay in touch if you are interested in making better decisions!

PS. I would like to thank Thomas Brand and Samu Mielonen, who commented on this blog post.

Further reading:

Ariely, D. (2004), available at: http://ww2.cfo.com/human-capital-careers/2004/06/avoiding-decision-traps/
Taleb, N. (2007). The Black Swan: The Impact of the Highly Improbable. Helsinki: Terra Cognita.

Mikael Nederström