Thinking, Fast and Slow Summary


This Thinking, Fast and Slow summary explains Daniel Kahneman’s contributions to our current understanding of psychology and behavioral economics. Through Thinking, Fast and Slow, we now understand how decisions are made, why certain errors in judgment are so common, and how we can do better.


Daniel Kahneman’s Perspective

Daniel Kahneman, Ph.D., received the Nobel Prize in Economics in 2002. He is a Principal Investigator at the Woodrow Wilson School of Public and International Affairs, Emeritus Professor of Psychology and Public Affairs at the Woodrow Wilson School, Eugene Higgins Professor Emeritus of Psychology at Princeton University, and a fellow of the Center for Rationality at the Hebrew University of Jerusalem.

Who Should Read Thinking, Fast and Slow?

  • Anyone interested in how our minds work, how we solve problems, how we form judgments, and what weaknesses our minds are prone to.
  • Anyone interested in Nobel Laureate Daniel Kahneman’s contributions to psychology and behavioral economics, and how these contributions apply to society as a whole.
  • And anyone bad at mental math…

Introduction

Daniel Kahneman’s Thinking, Fast and Slow, released in 2011, summarizes the decades of research that led to his Nobel Prize and explains his contributions to our current understanding of psychology and behavioral economics. Kahneman and his colleagues, whose work is covered extensively in this book, have significantly reshaped our understanding of the human mind. As a result, we now understand how decisions are made, why certain errors in judgment are so common, and how we can improve.

Would you like to school your brain? Let’s take a trip through the mind!

CHAPTER ONE

How Our Behavior Is Driven By Two Distinct Systems

Of two minds: how our behavior is driven by two distinct systems – one automatic and the other considered.

A compelling drama is playing out in our heads: a cinematic contest between two main characters, complete with twists, drama, and tension. The two characters are the impulsive, automatic, intuitive System 1 and the thoughtful, deliberate, calculating System 2. As they play off each other, their interactions shape our thinking, judgments, decisions, and actions.

System 1 is the part of our brain that operates intuitively and instantly, usually without our conscious control. You can experience this system at work when you hear a loud, unexpected sound. What do you do? Chances are you immediately and automatically shift your attention toward the sound. That is System 1.

This system is a result of our evolutionary past: there are inherent advantages to survival in acting and making judgments so quickly.

System 2 is what we picture when we imagine the part of the brain responsible for our individual decision-making, reasoning, and beliefs. It handles the conscious activities of the mind, such as self-control, deliberate choice, and intentionally focused attention.

For instance, imagine you are looking for a woman in a crowd. Your mind consciously focuses on the task: it recalls her features and anything else that might help locate her. This focus screens out potential distractions, and you barely notice the other people in the crowd. If you maintain this focused attention, you may well spot her within minutes; if you are distracted and lose focus, you will be hard-pressed to find her.

As we will see in the following chapters, the relationship between these two systems determines our behavior.

CHAPTER TWO

The Lazy Mind

The Lazy Mind: How laziness could lead to mistakes and affect our intelligence.

Try to solve the famous bat-and-ball problem to see how the two systems work:

A bat and a ball together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?

The price that most likely crossed your mind, $0.10, is the product of the intuitive, automatic System 1, and it is incorrect! Take a moment and do the math now.

Do you see your mistake? The correct answer is $0.05.
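For anyone who wants the arithmetic spelled out, here is a minimal check (written in Python purely for illustration; it is not from the book). The ball must cost $0.05 for both conditions – the $1.10 total and the one-dollar difference – to hold at once, while the intuitive $0.10 answer fails the total.

    # bat + ball = 1.10 and bat = ball + 1.00
    # => (ball + 1.00) + ball = 1.10  =>  2 * ball = 0.10  =>  ball = 0.05
    ball = (1.10 - 1.00) / 2
    bat = ball + 1.00
    print(round(ball, 2), round(bat + ball, 2))        # 0.05 1.1 -- both conditions hold

    # The intuitive answer fails the total-cost check.
    wrong_ball = 0.10
    print(round((wrong_ball + 1.00) + wrong_ball, 2))  # 1.2, not 1.1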

What happened is that your impulsive System 1 took control and answered automatically, relying on intuition. But it reacted too quickly.

Usually, when it faces a situation it cannot handle, System 1 calls on System 2 to resolve the problem. With the bat-and-ball problem, however, System 1 is tricked: it sees the problem as simpler than it really is and mistakenly assumes it can handle it on its own.

What the bat-and-ball problem reveals is our innate mental laziness. When we use our brains, we tend to spend the least amount of energy possible on each task. This is known as the law of least effort. Because checking the answer with System 2 would use more energy, our mind won’t do it if it thinks System 1 can handle the task on its own.

This laziness is regrettable, because using System 2 is an essential aspect of our intelligence. Research shows that practicing System 2 tasks, such as concentration and self-control, leads to higher intelligence scores. The bat-and-ball problem illustrates this: we could have avoided the common mistake simply by checking the answer with System 2.

By staying lazy and avoiding System 2, our mind limits the power of our intelligence.

CHAPTER THREE

Reasons We Don’t Always Consciously Control Our Thoughts And Actions

Autopilot: why we don’t always consciously control our thoughts and actions.

What comes to mind when you see the word fragment “SO_P”? Probably nothing. But what if you look at the word “EAT” first? Now, looking at “SO_P” again, you would probably read it as “SOUP.” This process is called priming.

We are primed when exposure to a word, concept, or event calls up related terms and concepts. For example, if you had seen the word “SHOWER” instead of “EAT” above, you would probably have read the fragment as “SOAP.”

Priming influences not only our thinking but also our actions. Just as the mind is affected by hearing certain words and concepts, the body can be affected too. A well-known example comes from a study in which participants primed with aging-related words such as “Florida” and “wrinkles” subsequently walked more slowly than usual.

Surprisingly, the priming of actions and thoughts is completely unconscious; we do it without even realizing it.

Priming, therefore, shows that, despite what many of us believe, we do not always have conscious control over our actions, judgments, and decisions. Instead, we are constantly being steered by particular social and cultural conditions.

Research by Kathleen Vohs, for example, shows that the concept of money primes individualistic behavior. People who are primed with the idea of money – for example, by being exposed to images of money – act more autonomously and are less willing to get involved with, depend on, or accept demands from others. One conclusion from Vohs’s research is that living in a society full of triggers that prime money can nudge our behavior away from altruism.

Priming, like other elements of society, can influence an individual’s thoughts and therefore their choices, judgments, and behavior – and because individual behavior is reflected in the culture, priming has a profound impact on the kind of society we all live in.

CHAPTER FOUR

Quick Judgments: How The Mind Makes Quick Decisions Even When It Doesn’t Have Enough Information To Make A Rational Decision

Imagine meeting someone called Ben at a party and finding it easy to talk to him. Later, someone asks if you know anyone who might want to contribute to their charity. You think of Ben, even though the only thing you know about him is that he is easy to talk to.

In other words, you liked one aspect of Ben’s character, so you assumed you would like everything else about him. We often approve or disapprove of a person even when we know very little about them.

Our minds tend to simplify things when they lack sufficient information, which often leads to errors in judgment. This is called exaggerated emotional coherence, better known as the halo effect: your positive feeling about how approachable Ben is places a halo around him, even though you know very little about him.

But that’s not the only shortcut our minds take when making judgments.

There is also confirmation bias: the tendency to agree with information that supports our previously held beliefs and to accept whatever suggestions are put to us.

This can be demonstrated by asking the question, “Is James friendly?” Studies have shown that, faced with this question and no other information, we are very likely to decide that James is friendly – because the mind automatically confirms the suggested idea.

Both the halo effect and confirmation bias occur because our minds are eager to make quick judgments. But this often leads to mistakes, because we don’t always have enough data to make the right call. Our minds rely on false suggestions and oversimplifications to fill the gaps in the data, leading us to potentially wrong conclusions.

Just like with priming, these cognitive phenomena occur without our knowledge and influence our decisions, judgments, and actions.

CHAPTER FIVE

Heuristics: How The Mind Utilizes Shortcuts To Make Snap Decisions

We often find ourselves in situations where we need to make quick assessments. To help us do this, our minds have developed little shortcuts for making immediate sense of our surroundings. These are called heuristics.

Usually, these processes are beneficial, but the problem is that our minds tend to use them too often. If we use them in situations for which they are not suitable, we can make mistakes. To better understand what heuristics are and what errors they can lead to, consider two of their many types: substitution heuristics and availability heuristics.

In the substitution heuristic, we answer a simpler question than the question actually asked.

Take this question, for example: “This woman is running for sheriff. How successful will she be in office?” We automatically replace the question we are supposed to answer with a simpler one, such as, “Does this woman look like someone who will be a good sheriff?”

This heuristic means that instead of researching the candidate’s background and policies, we merely ask the much simpler question of whether she fits our mental image of a good sheriff. Unfortunately, if she doesn’t match that image, we may reject her, even when years of law-enforcement experience make her the ideal candidate.

Then there’s the availability heuristic, where you overestimate the likelihood of something that you hear often or that is easy to remember.

For example, strokes cause far more deaths than accidents, yet one study found that 80 percent of respondents believed accidental death to be more likely. That is because we hear more about accidents in the media and they leave a stronger impression on us; we remember gruesome accidental deaths more readily than deaths from stroke, and so we may respond inappropriately to these dangers.


CHAPTER SIX

No Head For The Numbers: Why We Struggle To Understand Statistics And Make Mistakes That Can Be Avoided As A Result

How can you predict if certain things are going to happen?

One effective way is to consider the base rate. This is a statistical baseline on which other statistics depend. For example, imagine that a large taxi company runs 20 percent yellow taxis and 80 percent red taxis. That means the base rate for yellow taxis is 20 percent and the base rate for red taxis is 80 percent. So if you order a cab and want to guess its color, just remember the base rates and you will make a reasonably accurate prediction.

We should therefore always keep the base rate in mind when predicting an event, but unfortunately we don’t. In fact, base-rate neglect is extremely common.

One of the reasons we ignore the base rate is that we focus on what we expect rather than on what is most likely. Return to those taxis: if you watch five red taxis pass by, you probably start to feel that the next one is bound to be yellow, for a change. But no matter how many taxis of either color drive past, the probability that the next taxi will be red remains around 80 percent – and if we remembered the base rate, we would realize this. Instead, we tend to focus on what we expect to see, a yellow cab, and so we will probably be wrong.
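To make the point concrete, here is a minimal simulation (Python, illustrative only, assuming each cab’s color is independent of the ones before it): even immediately after five red cabs in a row, the next cab is still red about 80 percent of the time.

    import random

    random.seed(0)
    BASE_RATE_RED = 0.80  # 80% of the fleet is red, 20% yellow

    def next_cab():
        return "red" if random.random() < BASE_RATE_RED else "yellow"

    # Look only at cabs arriving right after a run of five reds: intuition says
    # "surely yellow now", but the base rate says red with probability 0.8.
    streak, after_streak = 0, []
    for _ in range(200_000):
        cab = next_cab()
        if streak >= 5:
            after_streak.append(cab)
        streak = streak + 1 if cab == "red" else 0

    print(f"share of red cabs right after five reds: {after_streak.count('red') / len(after_streak):.2f}")  # ~0.80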

Base-rate neglect is a common mistake tied to the broader problem we have with working with statistics. We also struggle to remember that things regress to the mean: every process has an average state, and large deviations from that average tend to be followed by results closer to it.

For example, if a footballer who averages five goals a month scores ten goals in September, her coach will be thrilled. But if she then goes back to scoring around five goals a month for the rest of the year, the coach will likely criticize her for not continuing her hot streak – even though she has simply regressed to her average.
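Here is a minimal sketch of regression to the mean (Python, illustrative only, assuming monthly goal totals are Poisson-distributed around a fixed “true” average of five): months that look like a hot streak are mostly luck, so the following month tends to land back near the average, with no slump required.

    import math
    import random

    random.seed(1)

    def goals_in_month(true_mean=5.0):
        # Draw a Poisson-distributed goal count (Knuth's method): chance variation around fixed skill.
        limit, k, p = math.exp(-true_mean), 0, 1.0
        while p > limit:
            k += 1
            p *= random.random()
        return k - 1

    # Whenever a simulated month is a "hot streak" (10+ goals), record the next month.
    followups = []
    for _ in range(100_000):
        this_month, next_month = goals_in_month(), goals_in_month()
        if this_month >= 10:
            followups.append(next_month)

    print(f"average in the month after a 10+ goal month: {sum(followups) / len(followups):.2f}")  # ~5, the true mean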

CHAPTER SEVEN

Past Imperfect: Why We Remember Events In Hindsight And Not Experiences

Our minds do not remember experiences directly. Instead, we have two distinct “memory selves,” each of which records situations differently.

First, there is the experiencing self, which registers how we feel in the present moment. It asks, “How does it feel now?”

Then there is the remembering self, which records how the whole event unfolded once it is over. It asks, “How was it on the whole?”

The experiencing self gives a more accurate picture of what happened, because our feelings during an experience are the most direct record of it. Yet the remembering self, which is less accurate because it assembles its account only after the situation has ended, dominates our memory.

There are two reasons why the remembering self dominates the experiencing self. The first is duration neglect: we ignore the overall length of an event in favor of a particular memory from it. The second is the peak-end rule: we give too much weight to what happens at the end of an event.

As an example of the remembering self’s dominance, consider a study that measured patients’ memories of a painful colonoscopy. Patients were divided into two groups: one group underwent long, drawn-out procedures, while the other had much shorter procedures in which the pain increased toward the end.

You might expect the unhappiest patients to be those who endured the lengthy procedure, since their pain lasted longer – and that is indeed how they felt at the time. When each patient was asked about the pain during the procedure, the experiencing self gave the accurate answer: those who underwent the longer procedures felt worse. But afterwards, once the remembering self took over, those who went through the shorter procedure with the more painful ending rated the experience as worse. This study offers a clear demonstration of duration neglect, the peak-end rule, and our imperfect memories.
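A minimal sketch of the two effects (Python, with made-up pain scores; the “average of peak and end” formula is a common reading of the peak-end rule, not a quote from the book) shows how a shorter procedure that ends at its worst can be remembered as more unpleasant despite involving far less total pain:

    # Made-up minute-by-minute pain ratings (0-10) for two hypothetical patients.
    long_mild_end   = [6, 7, 8, 7, 6, 5, 4, 3, 2, 1]  # long procedure that tapers off
    short_harsh_end = [6, 7, 8]                        # short procedure that ends at the peak

    def total_pain(ratings):
        # What the experiencing self lived through: pain summed over time.
        return sum(ratings)

    def remembered_pain(ratings):
        # Peak-end reading: memory tracks the worst moment and the final moment,
        # largely ignoring how long the whole thing lasted (duration neglect).
        return (max(ratings) + ratings[-1]) / 2

    for name, ratings in [("long, mild end", long_mild_end), ("short, harsh end", short_harsh_end)]:
        print(name, total_pain(ratings), remembered_pain(ratings))
    # long, mild end 49 4.5    -> more total pain, but remembered as milder
    # short, harsh end 21 8.0  -> less total pain, but remembered as worse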

CHAPTER EIGHT

Mind Over Matter: How Adjusting Our Mind Can Dramatically Affect Our Thoughts And Behavior

Our minds expend different amounts of energy depending on the task. When no attention is required and little energy is needed, we are in a state of cognitive ease. When a task demands our attention, the mind spends more energy and enters a state of cognitive strain.

These changes in brain energy levels have dramatic effects on our behavior.

Under cognitive ease, the intuitive System 1 is in charge of our minds, and the logical, more energy-hungry System 2 is weakened. This means we are more intuitive, more creative, and happier, but also more likely to make mistakes.

In a state of cognitive strain, our awareness is heightened, and System 2 takes charge. System 2 is more inclined to double-check our judgments than System 1, so although we are much less creative, we make fewer mistakes.

You can consciously influence how much energy your mind expends in order to get into the right mode for a given task. If you want a message to be persuasive, for example, try promoting cognitive ease.

One way to do this is to repeat the information. Information that is repeated, or simply easy to remember, becomes more persuasive, because our minds have evolved to respond positively to repeated exposure to the same clear message. When we notice something familiar, we slip into a state of cognitive ease.

Cognitive strain, on the other hand, helps us succeed at things like statistical problems.

We can enter this state when information is presented to us in a demanding way, for example in hard-to-read lettering. Our minds perk up and devote more energy to understanding the problem, which is why we are less likely to give up.

CHAPTER NINE

Take Risks: The Way Opportunities Are Presented To Us Influences Our Risk Assessment

The way we evaluate ideas and respond to problems is strongly influenced by how they are expressed to us. Thus, small changes in the details or focus of a statement or question can drastically change the way we handle it.

Our risk assessment method is a good example.

You might think that once we can determine the likelihood of a risk occurring, everyone approaches it the same way. But this is not the case. Even with carefully calculated probabilities, the way the number is expressed can change our approach.

For example, people judge a rare event to be more likely when it is expressed as a relative frequency rather than as a statistical probability.

In one experiment, two groups of psychiatric professionals were asked whether it was safe to discharge a patient, Mr. Jones, from a psychiatric hospital. The first group was told that patients like Mr. Jones had a “10% chance of committing acts of violence”; the second group was told that “out of 100 patients similar to Mr. Jones, 10 are estimated to commit acts of violence.” Nearly twice as many respondents in the second group refused to discharge him.

Another way our attention is pulled away from what is statistically relevant is called denominator neglect: we ignore the plain numbers in favor of vivid mental images that sway our decisions.

Compare these two statements: “This medicine protects children from disease X but has a 0.001% risk of permanent deformity” versus “One in 100,000 children who take this medicine will be permanently deformed.” Both statements describe the same risk, but the second conjures up the image of a deformed child and carries much more weight, making us less likely to administer the drug.
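A tiny sanity check (Python, illustrative only) makes the equivalence explicit: the percentage framing and the frequency framing describe exactly the same risk, so any difference in our reaction comes from the wording and imagery, not from the numbers.

    risk_percent = 0.001                   # "a 0.001% risk of permanent deformity"
    risk_probability = risk_percent / 100  # convert percent to a probability: 0.00001

    # The same risk expressed as a frequency out of 100,000 children.
    cases_per_100k = risk_probability * 100_000
    print(round(cases_per_100k, 6))        # 1.0 -> "one in 100,000 children"

    # The Mr. Jones framings match the same way: 10% of 100 similar patients is 10 patients.
    print(0.10 * 100)                      # 10.0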

CHAPTER TEN

No Robots: Why We Don’t Make Decisions Based Only On Rational Thinking

How do we make decisions as individuals?

For a long time, a group of powerful and influential economists argued that we make decisions on the basis of purely rational arguments. They claimed that we all decide according to utility theory, which states that when individuals make decisions, they look only at the rational facts and choose the option with the best overall outcome for them, that is, the one with the most utility.

For example, utility theory would make this kind of claim: if you prefer oranges to kiwis, you will also prefer a 10% chance of winning an orange to a 10% chance of winning a kiwi.

That seems obvious.
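For readers who like to see the claim spelled out, here is a minimal sketch (Python, with made-up utility numbers that are not from the book): under utility theory, scaling both prizes by the same probability never changes which option is preferred.

    # Made-up utilities: oranges are liked more than kiwis.
    utility = {"orange": 10.0, "kiwi": 6.0}

    def expected_utility(prize, chance):
        # Value of a lottery paying `prize` with probability `chance`, nothing otherwise.
        return chance * utility[prize]

    print(utility["orange"] > utility["kiwi"])                                # True: orange preferred outright
    print(expected_utility("orange", 0.10) > expected_utility("kiwi", 0.10))  # True: same ordering at 10% odds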

The most influential group of economists in the field centered on the Chicago School of Economics and its most famous researcher, Milton Friedman. Drawing on utility theory, the Chicago School argued that individuals in the marketplace are ultra-rational decision-makers, the kind of beings economist Richard Thaler and legal scholar Cass Sunstein later dubbed “Econs.” As Econs, all individuals act alike, valuing goods and services according to their rational needs, and they value their wealth rationally too, weighing only the benefit it brings them.

So imagine two people, John and Jenny, both of whom have a net worth of $5 million. According to utility theory, they have the same wealth, which means they should be equally happy with their finances.

But what if we complicate things a bit? Let’s say their $5 million fortunes are the result of a day at the casino, and the two had very different starting points: John came in with just $1 million and quintupled his money, while Jenny arrived with $9 million that dwindled to $5 million. Do you still think John and Jenny are equally happy with their $5 million?

Unlikely. Clearly, then, the way we value things depends on more than their sheer utility.

As we will see in the next chapter, we can make some strange and seemingly irrational decisions because we don’t perceive utility as rationally as utility theory assumes.

CHAPTER ELEVEN

Gut Feeling: Why We Don’t Make Decisions Based On Rational Considerations Alone But Are Often Influenced By Emotional Factors

If utility theory doesn’t hold up, what does?

An alternative is prospect theory, which the author developed with his long-time collaborator Amos Tversky.

Kahneman’s prospect theory challenges utility theory by showing that we don’t always act in the most rational way when we make decisions.

For example, consider these two scenarios: In the first scenario, you are given $1000 and must then choose between receiving a guaranteed $500 or taking a 50% chance of winning another $1000. In the second scenario, you are given $2000 and must choose between a sure loss of $500 or taking a 50% chance of losing $1000.

If we made purely rational decisions, we would choose the same way in both scenarios, since the possible final outcomes are identical: a guaranteed $1500, or a coin flip between $1000 and $2000. But that is not what happens. In the first scenario most people choose the sure gain, while in the second most people take the gamble.

Prospect theory helps explain why. It highlights at least two reasons why we don’t always act rationally, and both feed our loss aversion – the fact that we fear losses more than we value gains.

The first reason is that we evaluate outcomes against reference points. Whether we start with $1000 or $2000 changes how willing we are to gamble, because the starting point shapes how we value our position. The reference point is $1000 in the first scenario and $2000 in the second, so ending up with $1500 feels like a gain in the first scenario but a painful loss in the second. Even though this reasoning is clearly inconsistent, we judge value relative to our reference point rather than by the objective amount we end up with.

Second, we are influenced by the principle of diminishing sensitivity: the value we perceive can differ from the actual value. Dropping from $1000 to $900 doesn’t feel as bad as dropping from $200 to $100, even though both losses are worth the same in dollars. Likewise, in our example, the perceived loss in going from $1500 to $1000 is greater than in going from $2000 to $1500.
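To see how reference points, diminishing sensitivity, and loss aversion jointly produce the choices described above, here is a minimal sketch (Python) using the value function and parameter estimates commonly cited from Kahneman and Tversky’s later work; the numbers are illustrative, probability weighting is omitted, and none of this code comes from the book.

    ALPHA, LAMBDA = 0.88, 2.25  # commonly cited estimates; illustrative only

    def value(change):
        # Prospect-theory value of a gain or loss measured from the reference point:
        # diminishing sensitivity via the exponent, loss aversion via LAMBDA.
        if change >= 0:
            return change ** ALPHA
        return -LAMBDA * ((-change) ** ALPHA)

    # Scenario 1 (reference point $1000): sure +$500 vs. a 50% chance of +$1000.
    sure_gain   = value(500)
    gamble_gain = 0.5 * value(1000) + 0.5 * value(0)
    print(sure_gain > gamble_gain)   # True -> the sure gain feels better

    # Scenario 2 (reference point $2000): sure -$500 vs. a 50% chance of -$1000.
    sure_loss   = value(-500)
    gamble_loss = 0.5 * value(-1000) + 0.5 * value(0)
    print(gamble_loss > sure_loss)   # True -> the gamble feels less bad, so most people take it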


CHAPTER TWELVE

False Images: Why The Mind Needs Complete Images To Explain The World, And Why They Lead To Overconfidence And Mistakes

Our minds crave coherent pictures: we build complete mental images to explain ideas and concepts. For example, we carry many images in our heads for the weather. Our picture of summer weather, say, might be a bright sun bathing us in warmth.

These images not only help us make sense of things; we also rely on them when making decisions.

When making decisions, we refer to these images and base our assumptions and conclusions on them. For example, if we want to know what clothes to wear in summer, we decide based on our mental image of the weather at that time of year.

The problem is that we trust these images too much. Even when available statistics and data contradict our mental images, we let the images guide us. The forecast may predict relatively cool weather this summer, yet you still step out in shorts and a T-shirt because that is what your mental image of summer says you should wear. You may well end up shivering outside!

In short, we place enormous faith in our often imperfect mental images. But there are ways to overcome this overconfidence and make better predictions.

One way to avoid errors is to use reference class forecasting. Instead of judging from your rather generalized mental images, use specific historical cases to make a more accurate prediction. For example, think back to the last time you went out on a chilly summer day: what did you wear then?

In addition, you can set a long-term risk policy that lays out concrete actions for both accurate and inaccurate forecasts. Through preparation and protection, you rely on evidence rather than general mental images and make more accurate predictions. In our weather example, that might mean packing a sweater just in case.

Conclusion

The Key Message Of Thinking, Fast And Slow

Thinking, Fast and Slow shows that our mind contains two systems. The first acts instinctively and requires little effort; the second is more deliberate and demands much more attention. Our thoughts and actions vary depending on which of the two systems is in control of our brain at the time.

Helpful Advice

Repeat the message!

Messages are more persuasive when we are repeatedly exposed to them. This is probably because we have evolved to treat repeated exposure to something with no adverse consequences as a sign that it is, in itself, good.

Don’t be swayed by rare statistical events that are overly reported in the newspapers.

Disasters and other dramatic events are an essential part of our history, yet we often overestimate their statistical likelihood because of the vivid images the media attaches to them.

You are more creative and intuitive when you are in a better mood.

When you are in a better mood, the alert, analytical part of your mind tends to relax. It hands control over to the faster, more intuitive thinking system, which also makes you more creative.

We’re just scratching the surface here. If you don’t already have the original book, “Thinking, Fast and Slow by Daniel Kahneman,” order it here now on Amazon to learn the juicy details.
