Cognitive Biases: Flaw or Feature?

Cognitive biases, the prejudices and mental short-cuts we all carry, are bad, right? Anything that gets in the way of making rational decisions necessarily reduces our welfare. Our natural inclination towards the present moment, our aversion to loss, our inattention, and our procrastination lead many of us to under-save for retirement, with potentially disastrous consequences for ourselves and for society. Do these self-sabotaging biases reflect a flaw in our brains?

Perhaps not. Martie Haselton et al. (2014) suggest that humans may have developed these biases over millennia because they were adaptive: they improved our species’ chances of survival and procreation. Since we have had so much time to develop these biases and their associated rules of thumb (“heuristics”, for example: avoid people who appear to be sick), we should be very good at using them. Could there be specific circumstances in which we can rely on biases and heuristics to outperform slower and more complex decision processes?

Haselton et al. argue that there may be three categories of explanation for “apparent design flaws” in the human mind:

  1. Natural selection may favor heuristics over potentially more accurate decision processes because they are faster and require less information. For example, the representativeness heuristic and clustering bias may have been useful shortcuts because ancient hunter-gatherers found food in clumps, not evenly and randomly distributed across the landscape.
  2. Biases may appear as artifacts of experimentation. The mind evolved to perform tasks in the ancestral environment, not those presented by social scientists in the laboratory. For example, people find abstract logic problems of the form “if p, then q” enormously challenging. However, when the same problem is framed as “cheater detection” (if person A helps person B, B had better reciprocate), a skill that remains useful to modern humans, we do much better.
  3. Error Management Theory suggests that the best decision rule doesn’t necessarily minimize the number of mistakes, but rather ensures that the net consequences of the decisions are adaptive over the long run. For example, overconfidence and confirmation bias could be beneficial insofar as the net benefits of inflated self-confidence, resulting in risk-taking and leadership, exceed the costs of passivity, of not trying at all (the sketch after this list illustrates the asymmetric-cost logic). Would we have evolved at all without a bias, at least among some humans, towards ambition and perseverance?
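To see concretely why a “biased” rule can win, here is a minimal Python sketch of the Error Management Theory logic under hypothetical numbers of my own (a 5% predator base rate and a 100:1 cost ratio are illustrative assumptions, not figures from Haselton et al.): the rule that makes many more errors can still carry a much lower expected cost.

```python
# Minimal sketch of Error Management Theory with made-up numbers:
# when the two kinds of mistake carry very different costs, the rule that
# minimizes expected cost is not the rule that minimizes the number of errors.

# Scenario: a rustle in the grass is a predator 5% of the time.
# Missing a real predator (false negative) is far costlier than
# fleeing from wind (false positive).
P_PREDATOR = 0.05
COST_FALSE_NEGATIVE = 100.0   # ignoring a real predator
COST_FALSE_POSITIVE = 1.0     # fleeing from a harmless rustle

def expected_cost(p_flee: float) -> float:
    """Expected cost per rustle for a rule that flees with probability p_flee."""
    misses = P_PREDATOR * (1 - p_flee) * COST_FALSE_NEGATIVE
    false_alarms = (1 - P_PREDATOR) * p_flee * COST_FALSE_POSITIVE
    return misses + false_alarms

def error_rate(p_flee: float) -> float:
    """Fraction of rustles on which the rule makes the factually wrong call."""
    return P_PREDATOR * (1 - p_flee) + (1 - P_PREDATOR) * p_flee

for label, p_flee in [("never flee (fewest errors)", 0.0),
                      ("always flee (paranoid bias)", 1.0)]:
    print(f"{label:28s} error rate: {error_rate(p_flee):.2f}  "
          f"expected cost: {expected_cost(p_flee):.2f}")
```

Run as written, the “always flee” rule is wrong 95% of the time yet costs roughly a fifth as much per encounter as the error-minimizing “never flee” rule, which is exactly the trade-off Error Management Theory points to.
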
My two takeaways from this are:
  1. We should trust biases and use their associated heuristics, rather than slow deliberation, when the context is similar to the one that existed in the ancestral environment. For example, a farmer or an astronomer may well rely on their intuitions about the optimal planting strategy or about the distribution of matter in the post-Big Bang universe, since these contexts are likely to exhibit clumping.
  2. We should frame problems and decisions in ways that are not “evolutionarily novel”. For example, presenting doctors with survival rates as natural frequencies rather than probabilities improves their ability to interpret the results (a worked sketch follows below).
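As a rough illustration of that frequency-format point, here is a small Python sketch with hypothetical screening-test numbers of my own (a 1% base rate, 80% sensitivity, and a 10% false-positive rate are illustrative assumptions, not data from any study): the arithmetic is identical in both formats, but the head-count version is the one people tend to get right.

```python
# The same Bayesian question asked in two formats, with made-up numbers:
# 1% of patients have the disease, the test catches 80% of true cases,
# and it falsely flags 10% of healthy patients. Given a positive result,
# how likely is the disease?

# Probability format: Bayes' rule with decimals, which readers find opaque.
p_disease = 0.01
p_pos_given_disease = 0.80
p_pos_given_healthy = 0.10

p_pos = p_disease * p_pos_given_disease + (1 - p_disease) * p_pos_given_healthy
p_disease_given_pos = p_disease * p_pos_given_disease / p_pos
print(f"Probability format: P(disease | positive) = {p_disease_given_pos:.1%}")

# Natural-frequency format: the same arithmetic as a head count of 1,000 patients.
patients = 1000
sick = round(patients * p_disease)                                  # 10 sick
sick_positive = round(sick * p_pos_given_disease)                   # 8 positives
healthy_positive = round((patients - sick) * p_pos_given_healthy)   # 99 positives
share = sick_positive / (sick_positive + healthy_positive)
print(f"Frequency format: {sick_positive} of {sick_positive + healthy_positive} "
      f"positive results are actually sick (about {share:.0%})")
```

Both formats give an answer of roughly 7-8%, but “8 sick out of 107 positives” is far easier to grasp at a glance than the chain of conditional probabilities, which is the point of framing decisions in evolutionarily familiar terms.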

In sum, our brains may not have evolved to find truth. Rather, they’ve evolved to allow us to survive and propagate as a species. When confronted with decisions in the modern world, we should work with what we’ve got to make wise choices.
