Can you trust your brain when making decisions?

Feb 08, 2021

5 mins

Laetitia Vitaud, Lab expert

Future of work author and speaker

The notion of homo economicus (economic man) has been a significant principle in business and economics since the phrase was coined in the late 19th century. It defines people as rational, narrowly self-interested, and rather predictable beings who constantly seek to “maximize their utility”. The notion of homo economicus can even be viewed as the basis for all classical economics, and it has been used to model both human behavior and organizational efficiency.

The problem is that homo economicus is an idea with little basis in reality. Economic orthodoxies, dominated as they are by this fiction, rest on assumptions of rational behavior that overlook how people actually make decisions, which are shaped by such things as inherited behavior patterns and their day-to-day environment.

For the past 20 years or so, heterodox economics, which is any economic thought or theory that contrasts with orthodox schools of thought, has been gaining more and more influence. This has led to further insight into human behavior, particularly its inconsistent and irrational nature. Behavioral economics became popular during this period, with Daniel Kahneman winning the Nobel Prize in Economics in 2002 and Richard Thaler receiving the award in 2017. It seeks to build on the study of individual decision-making by emphasizing psychology.

Behavioral economics, which has become mainstream thanks to Kahneman, Thaler, and others, is a meeting of economics and psychology that seeks to allow a better understanding of the collective dimension of our behaviors. Some examples are the “sheep-like” behaviors found at play in financial markets, as well as the way those behaviors are influenced by history, the environment, culture, and emotions.

“Intelligence is not only the ability to reason; it is also the ability to find the relevant material in memory and to deploy attention when needed.” - Daniel Kahneman in Thinking, Fast and Slow

In 2011, with his book Thinking, Fast and Slow, Daniel Kahneman advanced the idea that “we are all biased”. The book has had a profound impact on our understanding of the effects of bias in the realm of human resources. Building on the advances in experimental psychology he had made with his colleague Amos Tversky, Kahneman dealt a near-fatal blow to the prevailing domination of rational choice theory in businesses.

The main theory that the book seeks to explore is that there are two thought systems within all of us that constantly confront one another. These are “fast thinking” and “slow thinking”.

Fast thinking, or “System 1”, is intuitive and emotionally dominated. Slow thinking, or “System 2”, is deliberate and logical, but it costs us much more time and energy. As such, it’s not possible to use it for all day-to-day decisions.

System 1 and System 2 are useful allegories

“Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.” - Daniel Kahneman in Thinking, Fast and Slow

Kahneman’s distinction between the two ways of making decisions, one intuitive and one logical, is nothing new in psychology. What the book does, somewhat repetitively, is associate each of these two systems with an allegory or example. As a result, you use your own System 1 to understand both System 1 and System 2.

System 1 is fast, intuitive, and often involuntary. System 2 is analytical, concentrated, and slow. The allegories make the two systems easier to grasp and save us time, and they also help illustrate how the two systems interact.

System 2 depends on the abilities of each individual. When using it, you may feel tired and find it harder to control your impulses, which come from System 1. This is referred to as ego depletion: the idea that willpower draws upon a limited reservoir of mental resources. A task that requires a high degree of self-control can therefore influence a subsequent task due to the fatigue it causes, even if the tasks are otherwise unrelated, because the individual becomes more impulsive and, consequently, more subject to their System 1.

Individuals attempt to avoid this fatigue and save resources by relying mostly on the insights gained from their System 1. Depending on your individual level of vigilance, as well as any training you may have received, it’s possible to force yourself to question your intuition to avoid bias and act more rationally.

In System 1, cognitive fluency is valued as it facilitates the rapid processing of information. But this ease of understanding may create illusions, as people tend to take as true that which they find easy to understand. When something is presented clearly, such as parts of a text that are bold or highlighted, it will often inspire more confidence. Conversely, parts that are unclear or puzzling will awaken the vigilance of System 2, where you are more inclined to critically question what something really means.

Overconfidence and hasty judgments

“It is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own.” - Daniel Kahneman in Thinking, Fast and Slow

The main strength, as well as the main weakness, of System 1 is that it is quick: it draws rapid conclusions without taking into account the complexity and ambiguity of some information, which can lead to over-hasty decision-making. A good example of an over-reliance on System 1 is the halo effect. As explained in our article on this: “The halo effect particularly skews talent recruitment. It explains why decisions are sometimes taken by recruiters within the first moments of the interview, or even before the interview when the halo effect occurs due to the candidate’s name or where they are from. It leads to discrimination in hiring and many recruitment errors.”

For the sake of consistency, and perhaps out of laziness, people often avoid dealing with contradictory information. It can be seen as preferable to “eliminate” all information that might impinge upon rapid decision-making. What Kahneman calls “what you see is all there is” (WYSIATI) explains a significant number of biases, including overconfidence, exposure bias, the halo effect, and hindsight bias, among others.

It also explains why you might avoid answering difficult questions by answering easier ones instead—something you don’t always know you are doing! Additionally, the two systems sometimes work together to better preserve coherence: System 2 looks for information that confirms the intuition of System 1, rather than looking to find information that may invalidate it.

Intuition isn’t always bad, but…

“Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.” - Daniel Kahneman in Thinking, Fast and Slow

Not everything produced by System 1 is necessarily bad and a lot of hunches work out. It should also be said that, in certain situations, your intuition can even save your life. Even so, your intuition should not be given too much weight. Kahneman discusses in detail the decisions made by recruiters and managers in companies. He says that recruiters must actively avoid relying on their intuition, as it acts as a serious bias in their decision-making. As such, it’s necessary to create processes that prevent System 1 from playing a role in HR decisions.

When several similar situations recur again and again, intuition can act as a kind of recognition. As Herbert Simon, an economist whose many works have influenced artificial intelligence, wrote: “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

Sadly, situations involving individuals are not like a game of chess and are more likely to be different than similar. In the workplace, it is usually best to avoid relying too much on intuition, as the overconfidence it produces can easily be mistaken for expertise.

Since the publication of Thinking, Fast and Slow and the introduction of the ideas of System 1 and System 2, the book has become widely regarded as a classic. Anyone looking to gain a better understanding of cognitive biases will find in Kahneman’s work a wealth of material on what these biases are, why they occur, and how to remedy them. Many other books have since been written on the subject, but few reach the level of insight found in Thinking, Fast and Slow.

Translated by Andrea Schwam

Photo: Welcome to the Jungle

