
Thinking, Fast and Slow

Daniel Kahneman's insights into our cognitive processes reveal how they influence our decisions, providing an invaluable toolkit to enhance our ability to understand and accept reality - one of the pillars of resilience.

Intro

Thinking, Fast and Slow is a summary of Professor Daniel Kahneman's research, which earned him the Nobel Prize in 2002. His work has had a profound impact on our understanding of decision-making processes and cognitive biases.

An article by Daniel Kahneman and Amos Tversky, "Prospect Theory: An Analysis of Decision under Risk", was by the end of 2016 the most cited article in the history of economics and management science.

Professor Kahneman clarifies the key findings through demonstrations instead of standard experiments. This way of presenting scientific findings is much more engaging and fun for the reader.

The main aim of the book is to improve your ability to identify and understand errors of judgment and choice in others and eventually in yourself, by providing a richer and more precise language to discuss them.

Key Ideas
Two Systems

Your brain operates using two systems: the intuitive System 1, which thinks fast, and the slower, controlling System 2. The complex relationship between these two systems is the main source of the cognitive biases and processes in the human brain.

System 1
  • Operates automatically and quickly, with little or no effort and no sense of voluntary control.

  • System 1 includes capabilities that you share with other animals (such as orientation, fears etc.), and mental activities that become fast and automatic through prolonged practice.

  • Many of the mental actions of System 1 are involuntary - e.g. orienting to a loud, unexpected sound.

  • System 1 continuously generates suggestions for System 2, which are adopted with little or no modification when everything runs smoothly.

  • System 1 has biases, however - systematic errors that it is prone to make in specific circumstances - and it cannot be turned off.

Impressions and feelings that originate effortlessly in System 1 are the main sources of the explicit beliefs and deliberate choices of System 2.
System 2
  • System 2 is who you think you are.

  • Allocates attention to the effortful mental activities that demand it, including complex computations.

  • Its operations require attention and are disrupted when attention is drawn away.

  • It is difficult or impossible to conduct several operations of System 2 at once.

  • Intense focus can make you effectively blind due to limited attention capacity.

    • It is illustrated by The Invisible Gorilla study (Page 23)

  • System 2 normally stays in a comfortable low-effort, lazy mode and is mobilised when System 1 does not offer an answer to a question.

  • It takes over when things get difficult and normally has the last word.

The division of work between two systems is highly efficient: it minimizes effort and optimizes performance.

However, a conflict between an automatic reaction and an intention to control it is common in your life (nicely illustrated by the reading experiments on page 25). The best you can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. Beware:

It is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize your own.
Critical Situations to Monitor for Better Decision-Making

Energy depletion makes you less willing to exert self-control when the next challenge comes around:

  • You forced yourself to do something.

  • You made a choice that involved conflict.

  • You tried to impress others.

  • You are simply tired or hungry (e.g. just before lunch).

As a result, you can become impulsive or aggressive and perform poorly in cognitive tasks and logical decision-making (illustrated by the study in which parole requests were far more likely to be granted right after the judges' meal breaks).

While System 2 tends to be lazy, over-reliance on the intuitive System 1 can lead to irrational decision-making. The signs to watch:

  • You are satisfied with superficially attractive answers.

  • You always rely on your intuition.

Priming effects - things that have just happened in the world or in your mind influence your next steps (this influence is called priming). The prime can be an idea, an activity, an emotion - almost anything. For example, people who had worked with words associated with old age (such as Florida) walked more slowly down the corridor after the experiment, and people who were induced to smile rated the cartoons they were shown as funnier. There is extensive evidence of priming and its significant impact on your judgment. There are no clear signs to watch for, because almost anything around you can act as a prime.

  • Priming has an extremely powerful impact on our lives, and some of the examples are rather disturbing: the idea of money primes individualism - a reluctance to be involved with others, to depend on them, or to accept demands from them - and reminding people of their mortality increases the appeal of authoritarian ideas.

Cognitive ease is what you experience when something feels familiar. That feeling may, however, be only an impression and can create significant illusions. For example, investors in a study conducted in Switzerland believed that stocks with fluent names would earn higher returns than those with clunky labels. Survival prospects are poor for an animal that is not suspicious of novelty, which is why this effect is so strong. The signs to watch:

  • Happy mood loosens the control of System 2 over performance.

  • Uncomfortable and unhappy mood makes you lose touch with your intuition.

  • Frequent repetition of nonsense makes it familiar and difficult for the mind to distinguish from the truth.

You are prone to apply causal thinking to situations that require statistical reasoning; this causal intuition is often wrong. The signs to watch:

  • A capacity for surprise is an essential aspect of your mental life, and surprise is the most sensitive indication of how you understand your world and what you expect from it. You automatically search for causality in events that surprise you.

You tend to quickly jump to conclusions. Conscious doubt demands mental effort and is therefore not in the repertoire of System 1. The signs to watch:

  • You are likely to believe almost anything when System 2 is otherwise engaged.

  • A tendency to like or dislike everything about a person without knowing it (Halo effect) increases the weight of first impressions.

  • First impressions can carry disproportionate weight in your decisions.

  • What you see is all there is (WYSIATI): jumping to conclusions based on limited evidence is a common cognitive trap. When you hold an opinion about something you know little about, you are probably in this territory.

Basic assessments are a feature of System 1, which continuously generates assessments of various aspects of the situation without specific intention and with little or no effort. They play an important role in intuitive judgment because they are easily substituted for answers to more difficult questions - the essential idea behind heuristics and biases. Related concepts are:

  • Intensity matching, which allows matching across diverse dimensions. If crimes were colors, murder would be a deeper shade of red than theft.

  • The mental shotgun, which refers to System 1 computing more than is wanted or needed.

Answering an easier question: when you face a difficult question, you often unconsciously replace it with an easier one. This substitution is the essence of intuitive heuristics.

  • You often have answers to questions you do not completely understand or rely on evidence that you can neither explain nor defend.

  • The dominance of conclusions over arguments is most pronounced where emotions are involved.

Heuristics and biases

Judgment heuristics are simple, efficient rules, either hardwired or learned, that you often use to form judgments and make decisions. These mental shortcuts are used to speed up the process of making decisions but can lead to systematic deviations from logic, probability, or rational choice theory. Biases are systematic errors, which result from reliance on heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response.

System 1 is highly adept at automatically and effortlessly identifying causal connections between events - even connections that do not exist. One example is the law of small numbers: you readily extrapolate findings from small samples to the general population, as the short simulation below illustrates.
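
A minimal simulation sketch (illustrative, not from the book): with a fair coin, small samples produce lopsided results far more often than large ones, which System 1 is quick to read as a real pattern.

```python
# A minimal sketch: small samples routinely produce extreme results purely
# by chance, which System 1 misreads as a meaningful pattern.
import random

random.seed(42)

def extreme_share(sample_size, trials=20_000):
    """Fraction of fair-coin samples that land >= 80% heads or >= 80% tails."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads >= 0.8 * sample_size or heads <= 0.2 * sample_size:
            extreme += 1
    return extreme / trials

for n in (5, 20, 100):
    print(f"n={n:3d}: {extreme_share(n):.1%} of samples look 'extreme'")
# Small samples (n=5) look extreme far more often than large ones (n=100),
# even though the underlying process is identical.
```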

Anchoring occurs when people consider a particular value for an unknown quantity before estimating that quantity: the estimate then stays close to the number they considered. There are two forms of the anchoring effect: deliberate adjustment by System 2 and a priming effect resulting from System 1 operations.

Anchoring can be measured, and it is an impressively large effect - typically in the range of 40-50%.
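
The book measures this with an anchoring index: the difference between the average estimates of the high-anchor and low-anchor groups, divided by the difference between the anchors themselves. A minimal sketch with made-up numbers (the anchors and group means below are hypothetical, not the book's data):

```python
# Anchoring index sketch with hypothetical numbers (not the book's data).
# An index of 100% would mean respondents simply adopt the anchor;
# 0% would mean they ignore it entirely.
high_anchor, low_anchor = 1200, 180               # anchors shown to two groups
mean_estimate_high, mean_estimate_low = 850, 400  # hypothetical group means

anchoring_index = (mean_estimate_high - mean_estimate_low) / (high_anchor - low_anchor)
print(f"Anchoring index: {anchoring_index:.0%}")  # ~44% with these numbers
```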

Availability heuristic is the process of judging frequency by "the ease with which instances come to mind".

  • It represents a substitution of the estimation of the size of a category by the ease with which instances come to mind.

  • A paradox of the availability heuristic is that retrieving more instances can backfire. It is illustrated by the exercise in which people were asked to list cases where they had behaved assertively: those asked to list six instances felt more assertive afterwards than those asked to list twelve. It is ease of retrieval, rather than quantity, that drives the availability heuristic.

Affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).

  • One of the implications is the mind's limited ability to deal with small risks: you either ignore them altogether or give them far too much weight - terrorism, for example, is heavily overweighted.

Representativeness heuristic is a scientific term for stereotypes, when they are used to judge probabilities.

  • Its benefit is that the intuitive impressions it produces are usually more accurate than chance guesses would be.

  • In other situations it misleads, particularly when it causes people to neglect base-rate information. (Base rates are an important feature of Bayesian statistics, explained in Chapter 14 of the book.)

The conjunction fallacy occurs when you judge the conjunction of two events to be more probable than one of the events on its own. In the book's example, people judged it more likely that the fictitious character Linda is a bank teller and a feminist than that she is a bank teller - a statistical impossibility (see the short calculation after this list). The reason is that people judge coherent, detailed stories to be more likely than general ones, when in fact they are more plausible but less probable.

  • Adding more detail to scenarios makes them more persuasive, but less likely to come true.
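
A short calculation (the probabilities are hypothetical, chosen only for illustration) showing why the conjunction can never be the more probable option:

```python
# A minimal sketch: the conjunction of two events can never be more
# probable than either event on its own.
p_bank_teller = 0.05            # hypothetical P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # hypothetical P(feminist | bank teller)

p_both = p_bank_teller * p_feminist_given_teller
print(f"P(bank teller)              = {p_bank_teller:.2f}")
print(f"P(bank teller AND feminist) = {p_both:.2f}")
assert p_both <= p_bank_teller  # holds for any choice of probabilities
```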

Regression to the mean: after an unusually bad performance, better performance is statistically expected (and after an unusually good one, worse), with no causal explanation required - see the short simulation after this list.

  • A paradox of regression to the mean is that because you tend to be nice to other people when they please you and nasty when they do not, you are statistically punished for being nice and rewarded for being nasty.

  • You look for causal explanations and ignore the statistical principle. A business commentator who correctly announces that "the business did better this year because it had done poorly last year" is likely to have a short tenure on the air.
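
A minimal simulation sketch (not from the book): performance is modelled as stable skill plus luck, and the worst performers in one round improve in the next round with no intervention at all.

```python
# A minimal sketch of regression to the mean: performance = skill + luck.
# Select the worst performers in round 1; in round 2 they improve on
# average, purely because their round-1 luck was unusually bad.
import random

random.seed(0)
skills = [random.gauss(0, 1) for _ in range(10_000)]       # stable skill
round1 = [s + random.gauss(0, 1) for s in skills]          # skill + luck
round2 = [s + random.gauss(0, 1) for s in skills]          # fresh luck

worst = sorted(range(len(skills)), key=lambda i: round1[i])[:500]
avg1 = sum(round1[i] for i in worst) / len(worst)
avg2 = sum(round2[i] for i in worst) / len(worst)
print(f"Worst 500 in round 1: avg {avg1:.2f} -> round 2 avg {avg2:.2f}")
# Round-2 scores move back toward the mean without any praise, blame,
# or training having taken place.
```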

Intuitive predictions: extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1.

  • Intuition is likely to deliver predictions that are too extreme and you will be inclined to put far too much faith in them.

Overconfidence

Overconfidence is a puzzling limitation of your mind: excessive confidence in what you believe you know, and your apparent inability to acknowledge the full extent of your ignorance and the uncertainty of the world you live in. It is created by several powerful illusions.

The illusion of understanding. Good stories provide a simple and coherent account of people's actions and intentions. In building those stories you tend to assign too large a role to talent, stupidity, and intentions, and too small a role to luck.

  • A general limitation of the human mind is its imperfect ability to recall what you used to believe before you changed your mind. One manifestation is hindsight bias, which causes you to underestimate the extent to which you were surprised by past events. It also leads you to judge other people's decisions by their outcomes rather than by the quality of the decision process.

  • Because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions - and to an extreme reluctance to take risks.

  • The illusion that one has understood the past feeds the further illusion that one can predict and control the future.

  • One of its manifestations is a generally over-optimistic belief in the impact of CEOs on firm performance. The correlation between the success of a firm and the quality of its CEO is in the range of 0.30, meaning the two share roughly 30% of their determining factors. If you applied this correlation to many pairs of firms, one run by a strong CEO and the other by a weak one, you would find the stronger-CEO firm performing better in about 60% of the pairs - only 10 percentage points better than random guessing.
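
A minimal simulation sketch (the model and numbers are assumptions for illustration, not the book's data) showing how a 0.30 correlation translates into roughly 60% of pairs:

```python
# A minimal sketch: with a correlation of about 0.30 between CEO quality
# and firm performance, the firm led by the stronger CEO outperforms in
# only ~60% of pairs.
import random

random.seed(1)
r, pairs, wins = 0.30, 100_000, 0
for _ in range(pairs):
    # Two firms: performance = r * CEO quality + independent noise.
    ceo_a, ceo_b = random.gauss(0, 1), random.gauss(0, 1)
    perf_a = r * ceo_a + (1 - r**2) ** 0.5 * random.gauss(0, 1)
    perf_b = r * ceo_b + (1 - r**2) ** 0.5 * random.gauss(0, 1)
    wins += (perf_a > perf_b) == (ceo_a > ceo_b)

print(f"Stronger-CEO firm ahead in {wins / pairs:.0%} of pairs")  # ~60%
```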

The illusion of skills and validity. A person who acquires more knowledge develops an enhanced illusion of her skills and becomes unrealistically overconfident.

  • Highly opinionated people with one clear big idea can be compared to hedgehogs. They make great television: two hedgehogs on different sides of an issue, each attacking the idiotic ideas of the adversary, make for a good show.

  • Complex thinkers can be compared to foxes. They recognize that reality emerges from the interactions of many different agents and forces, including blind luck. They are, however, less likely to be invited to television debates.

The illusion that you understand the past fosters overconfidence in your ability to predict the future.

  • Low confidence can often be more informative as an indicator of accuracy.

  • Intuitions versus formulas

    • Experts tend to be inferior to formulas. One reason may be that experts try to be clever, think outside the box, and consider complex combinations of features in making their predictions. Complexity more often reduces validity than improves it.

    • Experts are not consistent. For example, experienced radiologists who evaluate chest X-rays contradict themselves 20% of the time when they see the same picture on separate occasions.

    • An algorithm that is constructed on the back of an envelope is often good enough to compete with an optimally weighted formula; and certainly good enough to outdo expert judgment.

    • Algorithms can also anchor standard procedures. The Apgar score for newborns provided a consistent standard for determining which babies were in trouble, and the formula is credited with an important contribution to reducing infant mortality.

    • The hostility to algorithms is rooted in the strong preference that many people have for the natural over the synthetic or artificial.

Marital stability is well predicted by a formula: frequency of lovemaking minus frequency of quarrels.

Expert intuition can be trusted in fields characterised by an environment that is sufficiently regular to be predictable and that provides opportunities to learn these regularities through prolonged practice. When these conditions are satisfied, intuitions are likely to be skilled. Examples include physicians, nurses, athletes, and firefighters.

  • In contrast, stock pickers and political scientists who make long-term forecasts operate in a zero-validity environment.

  • Whether the professionals have a chance to develop intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.

  • The marker of skilled performance is the ability to deal with vast amounts of information swiftly and efficiently.

Intuition cannot be trusted in the absence of stable regularities in the environment.

Planning fallacy means that your plans and forecasts are unrealistically close to the best-case scenario.

  • This can be mitigated by consulting the statistics of similar cases (the outside view, or reference class forecasting).

Optimistic bias (the planning fallacy is one of its manifestations) may well be the most significant of the cognitive biases. Because it can be both a blessing and a risk, you should be both happy and wary if you are temperamentally optimistic.

  • The blessings of optimism are offered only to the mildly biased, who do not lose track of reality.

  • The evidence suggests that an optimistic bias plays a role - sometimes the dominant role - whenever individuals or institutions voluntarily take on significant risks, underestimating the odds they face.

  • The confidence in the future success is very often delusional. Optimism is widespread, stubborn and costly.

  • Most people genuinely believe that they are superior to most others on most desirable traits (90% of drivers believe they are better than average).

  • Overconfident CEOs tend to underperform. The damage caused by overconfident CEOs is compounded when the business press anoints them as celebrities.

If you were allowed one wish for your child, seriously consider wishing him or her optimism (mild optimism, however).
Choices of Econs and Humans

Economic theory is based on the assumption of a rational agent. Prospect theory challenges this assumption and offers a different perspective on the nature of choices. It is based on three principles:

  • Evaluation is relative to a neutral reference point, the so-called adaptation level. The reference point can be, for example, the salary raise you expected based on the raises received by your colleagues.

  • The principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes of wealth: turning on a weak light has a large effect in a dark room but may be undetectable in a bright one.

  • Loss aversion: losses loom larger than gains. This asymmetry between the power of positive and negative expectations or experiences has an evolutionary history - prioritising threats increases survival chances.

    • The loss aversion ratio has been estimated in the range of 1.5 to 2, which means that to balance a potential loss of 100 you would demand a potential gain of 150 to 200 (a minimal sketch below illustrates this).
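
A minimal sketch of that arithmetic (the ratio of 2 and the 50/50 gamble are illustrative choices, not a full prospect-theory model):

```python
# A minimal sketch: with a loss-aversion ratio of 2, a coin flip must offer
# roughly twice as much upside as downside before it feels acceptable.
LAMBDA = 2.0  # loss-aversion ratio, roughly 1.5-2 per the book

def felt_value(amount):
    """Psychological value of a gain or loss relative to the reference point."""
    return amount if amount >= 0 else LAMBDA * amount

def gamble_feels_ok(gain, loss, p_gain=0.5):
    """Accept a 50/50 gamble only if its felt value is non-negative."""
    return p_gain * felt_value(gain) + (1 - p_gain) * felt_value(-loss) >= 0

print(gamble_feels_ok(gain=150, loss=100))  # False: 150 upside is not enough
print(gamble_feels_ok(gain=200, loss=100))  # True: 200 balances a 100 loss
```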

The endowment effect is a result of loss aversion: you put more value on goods simply because you own them. Giving up a bottle of nice wine is more painful than getting an equally good bottle is pleasurable.

  • One can mitigate this effect by asking the right question: "How much do I want to have this, compared with other things I could have instead?"

The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.

  • The long-term success of a relationship depends far more on avoiding the negative than on seeking the positive.

  • You are driven more strongly to avoid losses than to achieve gains, which can bias decisions towards excessive caution.

  • Loss aversion is a powerful conservative force that favours stability and holds your life near the reference point.

The fourfold pattern. People attach values to gains and losses rather than to wealth, and the decision weights that they assign to outcomes are different from probabilities.

  • The fourfold pattern - considered one of the core achievements of prospect theory - combines gains versus losses with high versus low probabilities: people tend to be risk averse for high-probability gains and low-probability losses (insurance), and risk seeking for low-probability gains (lotteries) and high-probability losses; the sketch after this list reproduces it.

  • Paying a premium to avoid a small risk of a large loss is costly, especially for a large organization that makes many such decisions.
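
A minimal sketch of the fourfold pattern. It uses the prospect-theory value and probability-weighting functions with the parameter values Tversky and Kahneman estimated in their 1992 paper; the exact functional forms and numbers are from that paper, not from the book's text.

```python
# Fourfold-pattern sketch using cumulative prospect theory parameters
# (Tversky & Kahneman, 1992): alpha=0.88, lambda=2.25, gamma=0.61/0.69.
ALPHA, LAMBDA = 0.88, 2.25
GAMMA_GAIN, GAMMA_LOSS = 0.61, 0.69

def value(x):
    """Value of a gain or loss relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def weight(p, gamma):
    """Decision weight attached to a probability (over/underweighting)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prefers_gamble(p, outcome):
    """Compare a p chance of `outcome` against its expected value for sure."""
    gamma = GAMMA_GAIN if outcome >= 0 else GAMMA_LOSS
    gamble = weight(p, gamma) * value(outcome)
    sure_thing = value(p * outcome)
    return gamble > sure_thing

for p, outcome in [(0.95, 10_000), (0.05, 10_000), (0.95, -10_000), (0.05, -10_000)]:
    choice = "risk seeking" if prefers_gamble(p, outcome) else "risk averse"
    print(f"{p:.0%} chance of {outcome:+,}: {choice}")
# Output: risk averse / risk seeking / risk seeking / risk averse -
# the fourfold pattern.
```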

Broad framing - a risk policy is an example of it. A broad frame embeds a particular choice in a set of similar choices. If you see each gamble as part of a bundle of gambles, you get significantly closer to economic rationality: you win a few, you lose a few (see the sketch below). The main purpose is to control your emotional response when you do lose.
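
A minimal simulation sketch (made-up gamble, not from the book) of why broad framing helps: a single favourable gamble loses half the time, while a bundle of a hundred of them almost never ends in an overall loss.

```python
# A minimal sketch: a single 50/50 gamble of +200 / -100 feels risky,
# but a bundle of 100 such gambles almost never produces an overall loss.
import random

random.seed(3)

def play(n_gambles):
    """Total payoff from n independent 50/50 gambles of +200 or -100."""
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_gambles))

trials = 10_000
single_loss_rate = sum(play(1) < 0 for _ in range(trials)) / trials
bundle_loss_rate = sum(play(100) < 0 for _ in range(trials)) / trials
print(f"Chance of losing on one gamble:      {single_loss_rate:.0%}")   # ~50%
print(f"Chance of losing on a bundle of 100: {bundle_loss_rate:.2%}")   # well under 1%
```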

Money is a proxy for points on a scale of self-regard and achievement. You carefully keep score of them. Examples are:

  • The disposition effect - a preference for selling winners rather than losers in your investment portfolio. It is an example of narrow framing and can be very costly.

  • The sunk-cost fallacy - it makes companies continue troubled initiatives that have already cost money rather than invest in new, more promising projects. The sunk-cost fallacy keeps people too long in poor jobs, unhappy marriages, and unpromising research projects.

  • Regret - the asymmetry in the risk of regret favors conventional and risk-averse choices. For example consumers favor brand names over generics to avoid the risk of regret.

  • The intense aversion to trading increased risk for some other advantage plays a big role in the laws and regulations governing risk. This trend is especially strong in Europe.

Reversals refer to the tendency to change an evaluation when a specific case is presented jointly with another one rather than on its own.

  • Joint evaluations, which trigger comparative judgment, necessarily involve System 2 and are more likely to be stable than single evaluations, which often reflect the intensity of System 1's emotional responses.

  • The legal system, contrary to psychological common sense, favors single evaluations.

Frames and reality. Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have an opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound.

  • An article published in 2003 noted that the rate of organ donation consent was close to 100% in Austria but only 12% in Germany, and 86% in Sweden but only 4% in Denmark. The difference is caused by the format of the critical question: the high-donation countries use an opt-out form (you have to check a box if you do not wish to donate), while the low-donation countries use an opt-in form.

Experiencing self and remembering self

Experiencing self is the one that answers the question: "Does it hurt now?" The remembering self is the one that answers the question: "How was it, on the whole?"

What you remember is not exactly what you have experienced.

What you learn from the past is to maximize the quality of your future memories, not necessarily of your future experience - the tyranny of the remembering self.

Tastes and decisions are shaped by memories, and the memories can be wrong.

The remembering self composes life as a story, which is about significant events and memorable moments, not about time passing.

Duration neglect is normal in a story, and the ending often defines its character. We all care intensely for the narrative of our own life and very much want it to be a good story, with a decent hero.

The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness.
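
A minimal sketch (illustrative ratings, not from the book's experiments) contrasting what the remembering self scores - peak and end - with what the experiencing self actually lived through:

```python
# A minimal sketch: the remembering self scores an episode by its peak and
# its end, neglecting how long it lasted.
def remembered(ratings):
    """Peak-end rule: average of the most intense moment and the final one."""
    return (max(ratings) + ratings[-1]) / 2

def experienced(ratings):
    """Total experienced value: intensity summed over the whole duration."""
    return sum(ratings)

short_intense = [9, 9, 8]        # 3 moments of intense joy
long_moderate = [6] * 20         # 20 moments of moderate happiness

print(remembered(short_intense), remembered(long_moderate))    # 8.5 vs 6.0
print(experienced(short_intense), experienced(long_moderate))  # 26  vs 120
# Memory prefers the short intense episode; lived experience favours the long one.
```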

General satisfaction with life is a question addressed to your remembering self and, as such, does not capture the daily well-being you actually experience. Measures of experienced well-being must be different, referring to the use of time and actual activities.

A disposition for well-being is as heritable as height or intelligence. People who appear equally fortunate vary greatly in how happy they are.

One recipe for a dissatisfied adulthood is setting goals that are especially difficult to attain. A concept of well-being cannot ignore what people want.

The focusing illusion gives extra weight to whichever aspect of life your attention is directed to. It can be summarised in a single sentence:

Nothing in life is as important as you think it is when you are thinking about it.

The focusing illusion creates a bias in favor of goods and experiences that are initially exciting, even if they will eventually lose their appeal. Time is neglected, causing experiences that will retain their attention value in the long term to be appreciated less than they deserve to be.

Techniques

Professor Kahneman is not optimistic about the potential for personal control of biases. He nevertheless proposes some techniques to at least partially mitigate their impact.

Recognize the signs that you are in a cognitive minefield, slow down, and activate System 2.

Attention training improves executive control and performance on nonverbal tests of intelligence. Attention is argued to be something distinct from intelligence, and to be more indicative of an individual's ability to avoid cognitive biases.

When deriving the most useful information from multiple sources of evidence, you should always try to make these sources independent of each other. This rule is part of good police procedure: witnesses are kept from discussing the event before testifying so that they cannot influence one another.

Simple meeting rule: before an issue is discussed, all members of the committee should be asked to write a brief summary of their position. Standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.

Using simple algorithms and checklists helps to overcome many biases; such tools tend to consistently outperform experts.

The outside view - using information from other ventures similar to the one being forecast - is a cure for the planning fallacy.

The premortem - a tool to mitigate optimistic bias: "Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please write a brief history of that disaster."

Risk policies help to mitigate exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion.

Organizations are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures. They are factories that manufacture judgments and decisions. The production stages are: the framing of the problem, the collection of relevant information, reflection and review. Efficiency improvements should be sought at each of these stages.
