Kahneman, "Think Slowly... Decide Quickly". The mind makes decisions quickly despite insufficient information.


About what?

A book about statistics and how the brain works. Also, there is a lot about social psychology.

In this respect it is similar to Robert Cialdini's "The Psychology of Influence", and Kahneman even refers to "The Black Swan" five times.

By the number of associations, thoughts, images, and intersecting ideas that arise in your head while reading, few books compare to it. Reading any paragraph, you recall five more articles you once read somewhere; knowledge gets compared and put in order.

This book cannot be "swallowed" in one gulp: you have to work through it, think it over, and carefully file it away in your head so as not to miss or forget anything. That is where this review helps me. While reading, there is a strong feeling of upgrading your own mind. The book is so dense that a full summary would not fit the LiveJournal post format by character count. It contains a huge number of interesting ideas, and it is a pity to leave any of them without attention.

The book answers the question "Does a person have an intuitive understanding of statistics?". And the answer is "no", with certain reservations.

What does "think slow, decide fast" mean?

"Fast thinking" includes both kinds of intuition, that is, expert knowledge and heuristics, as well as all the fully automatic operations of the brain in the areas of perception and memory that let you recall the capital of Russia or determine that there is a lamp on the table.

From time to time, neither a rationally grounded answer nor a heuristic guess comes to mind. In such cases we often switch to a slower, deeper form of thinking that requires greater effort. This is "slow thinking".



Quotes:

"As it turned out, even statisticians have poor statistical intuition."

“In the 1970s, two assumptions were generally accepted. First, most people are rational. Second, most deviations from rationality are explained by emotions such as fear, attachment, or hatred. Our article challenged both assumptions. We documented persistent errors of thinking in normal people and found that they stem more from the mechanism of thought itself than from the disruption of thinking by emotions.”


  • All kinds of voluntary effort, whether cognitive, emotional, or physical, draw at least in part on a shared pool of mental energy. It was observed that acts of will or self-control are tiring: if you force yourself to do something, then on the next task the capacity for self-control is weakened. This phenomenon is called ego depletion. Subjects who were instructed to suppress their emotional reaction to a film that provoked strong feelings did poorly on a subsequent test of physical endurance, in which they had to squeeze a dynamometer hard against growing discomfort. The emotional effort of the first phase of the experiment reduced their ability to tolerate the pain of prolonged muscle contraction, so the ego-depleted subjects yielded sooner to the urge to stop. In another experiment, subjects were first depleted by a task of eating "healthy" foods like radishes and celery while resisting the temptation of chocolate and cake. These subjects later gave up earlier than the rest on a difficult cognitive task.

  • Gilbert considers disbelief to be an operation of the slow system and cites an elegant experiment in support. Subjects saw meaningless statements, such as "a dinka is a flame", followed a few seconds later by the word "true" or "false". Later they were asked which sentences had been marked "true". In one version of the experiment, participants had to keep digits in memory during the task. This interference with the slow system had a selective effect: it became difficult for the subjects to "disbelieve" the false sentences, and in the subsequent test the depleted participants recalled many of the false statements as true. An important conclusion follows: when the slow system is occupied with something else, we are ready to believe almost anything. The fast system is naive and inclined to believe; distrust and doubt are the prerogative of the slow system, which is sometimes busy and often lazy. There is evidence that empty but persuasive messages, such as advertising, have a stronger effect on tired, depleted people.

There are many experiments on priming in the book. Priming made a very strong impression on me. Here are some of the experiments:

  • The word sets given to one group of students contained words associated with old age: "Florida", "forgetful", "bald", "gray", "wrinkles". Having finished the task, the young people had to walk to another office for the next test. The essence of the experiment was this short walk: the researchers quietly recorded the time it took to pass along the corridor. As Bargh predicted, the young people who had composed sentences from the old-age words walked along the corridor noticeably more slowly. All of this happens completely unconsciously. In the follow-up survey, not one of the students said that he had noticed a common theme in the words, and all of the subjects insisted that the words they read had no effect on their actions after the first experiment. They did not consciously grasp the idea of old age, but their behavior changed.

  • Participants in another experiment were shown a list of five words, from which they had to select four and make a phrase on a money-related topic (for example, the list "good", "work", "new", "table", "paid" became the phrase "new well-paid job"). Other priming methods were less obvious and involved the unobtrusive presence of a money-related object: a stack of Monopoly money forgotten on the table, or a computer screensaver showing banknotes floating on water. Money-primed people behave more independently than they would without this associative trigger. They persisted almost twice as long in trying to solve a difficult problem before asking the experimenter for help, a clear sign of increased self-reliance. People primed this way also became more selfish and less willing to help other students who pretended not to understand the experimental task. When the experimenter clumsily dropped a handful of pencils, the participants primed (unconsciously) with money picked up fewer of them. In another experiment of the same series, subjects were told that they were about to meet a new participant and were asked to set out two chairs while the experimenter went to fetch him. Those primed with money placed the chairs farther apart (118 cm versus 80 cm for the other participants), and money-primed students were more willing to be left alone. The common theme of all these findings is that the thought of money primes individualism: an unwillingness to interact with others, to depend on them, or to accept requests from them. Some cultures provide frequent reminders of respect, others of God, and some societies prime obedience with huge portraits of the dear leader.
Can there be any doubt that the ubiquitous images of the national leader in dictatorial regimes not only instill the feeling that "Big Brother is watching you" but also reduce the amount of spontaneous thought and independent action?

  • The priming effect was vividly demonstrated in the office kitchen of a British university. Employees paid for the tea or coffee they drank during the day by putting money into an honesty box; a list of suggested prices hung nearby. One day, without warning or explanation, a photograph was posted above the price list. For ten weeks the photograph changed every week: it was either flowers or eyes looking straight at the observer. No one discussed the new decor, but the contributions changed significantly. In the first week of the experiment, the tea and coffee drinkers were watched by wide-open eyes, and the average contribution was 70 pence per liter of milk. In the second week the photo showed flowers, and the average contribution dropped to 15 pence. The trend held: on average, contributions in the "eye" weeks were almost three times higher than in the "flower" weeks. Evidently, a purely symbolic reminder of being watched encouraged people to behave more decently, and quite obviously all of this happens unconsciously.

    I associate priming with such phenomena as the "broken windows effect", the "cargo cult", and even the "placebo effect", and it all merges together. What can be taken from this? Apparently, a disciplined tuning of the environment when introducing technologies and rules.


Later, attempts were made to reproduce the priming experiments described by Kahneman, and they failed. He publicly admitted his mistake.

What can one say? This is a real scientist: he did not deny it, he admitted the mistake. A charlatan will never in his life admit his mistakes.

Some examples of how people misunderstand probability:

The main reason why people misunderstand probability is that they think it is the same as "plausibility", even though they are different things.


  • I recently questioned my long-standing impression that marital infidelity is more common among politicians than among doctors or lawyers. At one time I even came up with explanations for this "fact", including the allure of power and the temptations of life away from home. Eventually I realized that the misdeeds of politicians are simply reported far more often than the misdeeds of lawyers and doctors. My intuitive impression could have been formed entirely by the topics journalists choose to report and by my tendency to rely on the availability heuristic.

  • Among players, coaches, and fans there is a belief that players sometimes have a "hot hand". It is impossible to resist this conclusion: if a player sinks three or four shots in a row, a causal belief arises that he will keep shooting better than the others. Both teams adjust to this judgment: teammates pass more often to the lucky player, and the opposing defense tries to block him. The analysis of thousands of shot sequences led to a disappointing conclusion: there is no "hot hand" in professional basketball, neither in field shots nor in free throws. Of course, some players are more accurate than others, but the sequence of hits and misses passes every test of randomness. Everything else is the invention of observers who tend to find order and causality in random events. The "hot hand" is a widespread cognitive illusion.

  • One of the most experienced instructors in the group responded with a speech of his own: "I have repeatedly praised cadets for cleanly executed aerobatic maneuvers. On the next attempt at the same maneuver, they do worse. And when I scold them for a bad performance, the next time they usually do better. So please don't tell us that reward works and punishment doesn't, because the opposite is true." Suddenly, in a joyful moment of insight, I saw in a new light a statistical principle I had been teaching for years. The instructor was right, and at the same time completely wrong! He had astutely observed that occasions when he praised the execution of a maneuver were likely to be followed by disappointments, while punishments were followed by improvements. But his conclusion about the effectiveness of rewards and punishments was entirely mistaken. What the instructor observed was regression to the mean, caused by random fluctuations in performance quality. Naturally, he praised only a cadet who had performed a maneuver much better than average; but that cadet was probably just lucky on that attempt, so his next attempt would be worse whether he was praised or not. And vice versa: the instructor scolded a cadet who had performed unusually badly, and that cadet would therefore improve on the next attempt regardless of anything the instructor did.
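The instructor's illusion is easy to reproduce in a toy simulation (my own sketch, not from the book): make every flight pure luck, praise the top quartile and scold the bottom quartile, and "punishment" will appear to work while praise "backfires", with zero causal effect.

```python
import random

random.seed(42)

def simulate(trials=100_000):
    """Performance here is pure luck: two independent draws per cadet.
    The instructor praises after a top-quartile flight and scolds after
    a bottom-quartile one, then looks at the next attempt."""
    after_praise, after_scold = [], []
    for _ in range(trials):
        first = random.gauss(0, 1)    # today's flight
        second = random.gauss(0, 1)   # next flight, independent of the first
        if first > 0.67:              # roughly the best quartile
            after_praise.append(second)
        elif first < -0.67:           # roughly the worst quartile
            after_scold.append(second)
    return (sum(after_praise) / len(after_praise),
            sum(after_scold) / len(after_scold))

mean_after_praise, mean_after_scold = simulate()
# Both means come out near 0: the praised cadets "get worse" (they fall
# back toward average) and the scolded ones "improve", purely by chance.
print(mean_after_praise, mean_after_scold)
```

The praised group averaged well above zero on the first flight and near zero on the second, so to the instructor's eye praise "made them worse", exactly the regression artifact Kahneman describes.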

  • In one experiment, subjects who read about "a disease that kills 1,286 people out of 10,000" considered it more dangerous than subjects who were told of "a disease that kills 24.14% of the population". The first disease feels more threatening than the second, although the risk of death in the first case is roughly half as large! Denominator neglect becomes even clearer if the statistics are brought to comparable scales: a disease that kills 1,286 people out of 10,000 seems more dangerous than one that kills 24.4 people out of 100. The effect would probably weaken or disappear if subjects were asked to compare the two formulations directly, which is a task for System 2. But life is a between-subjects experiment in which only one formulation is presented at a time. Only an exceptionally active System 2 is capable of generating alternative formulations of what it encounters and noticing that they evoke different reactions.
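Making the denominator explicit is exactly the step System 1 skips; a trivial sketch of my own:

```python
# The two framings from the experiment, reduced to a common denominator.
deaths, population = 1286, 10_000        # "kills 1,286 people out of 10,000"
rate_a = deaths / population             # = 0.1286, i.e. 12.86%
rate_b = 24.14 / 100                     # "kills 24.14% of the population"

# Despite feeling scarier, disease A is roughly half as deadly as disease B.
assert rate_a < rate_b
print(f"A: {rate_a:.2%}   B: {rate_b:.2%}")  # A: 12.86%   B: 24.14%
```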

  • Two passionate fans plan to drive 40 miles to see a basketball game. One has already bought a ticket; the second was about to buy one when he received a ticket as a gift from a friend. On the day of the game a snowstorm is forecast. Which of the two fans is more likely to brave the blizzard to see the game? The answer comes instantly: we know that the fan who paid for his ticket is more likely to make the trip. However they obtained their tickets, both will be disappointed if they stay home, but the balance is more "negative" for the one who paid and now risks being left without his money and without the game. Because staying home is worse for this person, he has a stronger incentive to see the game and is therefore more likely to drive through the blizzard. This is an implicit calculation of emotional balance, and System 1 performs such calculations without deliberation. Standard economic theory does not recognize the emotions people attach to the state of their mental accounts. An Econ understands that the ticket has already been paid for and that the money will not come back: its price is a sunk cost, and an Econ would not even consider whether he bought the ticket or received it from a friend.

  • Experienced forensic psychologists and psychiatrists are also susceptible to these effects of different formats for expressing risk. In one experiment, professionals assessed how safe it would be to discharge from a psychiatric hospital a certain Mr. Jones, who had a history of violence. The same statistic was presented in two different ways:

    For patients like Mr. Jones, there is a 10% chance of committing a repeated act of violence in the first months after leaving the hospital.
    Of a hundred patients like Mr. Jones, ten commit an act of violence in the first months after leaving the hospital.

    Professionals who were given the information in the frequency format were almost twice as likely to refuse the discharge (41%, compared with 21% of respondents who were given the information in the probability format). The more vivid description leads to the same probability being given more weight in the decision.


  • The idea that the speed, vividness, and ease of imagining affect the weight given to outcomes has been confirmed in numerous studies. Participants in one well-known experiment were asked to choose one of two vessels and draw a marble from it; red marbles won a prize. Vessel A held 10 marbles, one of them red; vessel B held 100 marbles, eight of them red.

    Your chance of winning is 10% with vessel A and 8% with vessel B, so the correct choice seems obvious. In fact, it turned out otherwise: 30-40% of the subjects chose the vessel with the larger number of winning marbles, thereby preferring the smaller chance of winning.

    The result illustrates System 1's tendency to process data superficially. If your attention is drawn to the winning marbles, you no longer assess the number of losing ones. Denominator neglect is amplified by the vividness of the image; in any case, I (Kahneman) experienced it myself. Imagining the smaller vessel, I see a lone red marble on a blurry white background; when I think of the larger one, I see eight prize marbles against the same vague background, which inspires more hope.
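A quick simulation of the two vessels described above (my own sketch) confirms what the percentages already say: the vessel with the vivid pile of red marbles pays out less often.

```python
import random

random.seed(0)

# Vessel A: 1 red marble out of 10 (a 10% chance of a prize).
# Vessel B: 8 red marbles out of 100 (only an 8% chance).
def win_rate(red, total, draws=200_000):
    """Estimate the chance of drawing a red marble by repeated simulated draws."""
    return sum(random.randrange(total) < red for _ in range(draws)) / draws

rate_a = win_rate(1, 10)
rate_b = win_rate(8, 100)
# The "richer-looking" vessel B, with its eight vivid red marbles, pays
# out less often than the plain-looking vessel A.
print(rate_a, rate_b)
```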


  • A figurative, vivid representation of an outcome, regardless of its emotional coloring, reduces the role of probability in evaluating an uncertain prospect. This hypothesis suggests (and I believe it) that adding insignificant but vivid details to a monetary outcome also disrupts the calculation. For example, compare your cash equivalents for the following outcomes:

    21% chance of getting $59 next Monday;

    21% chance of getting a large blue cardboard envelope with $59 in it on Monday morning.

    The new hypothesis is that in the second case sensitivity to probability will be lower, because the mention of the blue envelope evokes a fuller image than an abstract reference to money. You construct the event in your mind and vividly experience the outcome, even knowing that its probability is low. Cognitive ease also contributes to the certainty effect: when you vividly imagine an event, the possibility of its not occurring is also vividly represented and therefore overweighted.


  • The most important studies of availability bias were conducted by our friends in Eugene, where Paul Slovic and his longtime collaborator Sarah Lichtenstein were joined by our former student Baruch Fischhoff. They carried out groundbreaking research on the perception of risks, including a survey that has become a classic example of availability bias. Subjects were asked to consider pairs of causes of death, such as diabetes and asthma, or stroke and accidents. For each pair they had to name the more frequent cause and estimate the ratio of the two frequencies. The estimates were compared with the statistics of the time. Here is a sample of the results:

    Strokes cause twice as many deaths as accidents, but 80% of respondents considered death by accident more likely.

    Tornadoes were named a more frequent cause of death than asthma, although asthma kills 20 times more people.

    Death from a lightning strike was judged less likely than death from botulism, although lightning kills 52 times more often.

    Death from disease and death from accident were judged equally likely, although disease is 18 times more likely to be the cause of death.

    Death from an accident was judged 300 times more likely than death from diabetes, although the true ratio is 1:4.

    The lesson is clear: estimates of causes of death are distorted by media coverage, which usually leans toward unusual and tragic events. The media do not just shape the public's interests; they are themselves shaped by them. Journalists cannot ignore the public's demand for detailed coverage of certain topics and points of view. Unusual events (such as botulism) attract a disproportionate amount of attention and therefore seem more frequent than they really are. The world in our heads is not an exact reflection of reality; our estimates of the frequency of events are distorted by the prevalence and emotional intensity of the information that surrounds us.


The book covers many other topics: the anchoring effect, the substitution of an easier question for a hard one, availability bias, regression to the mean, and so on. They are difficult to summarize; they have to be read.

The halo effect:

On pages 188 through 197 the author goes through books like "The Toyota Way" in a very interesting and critical fashion.
In principle, I very much agree with Kahneman's arguments here. The principles may be good, but the historical management decisions described in such books are not. However, the book of this genre that takes this blow for me is ... and, of course, it does not withstand Kahneman's criticism either (a subjective opinion). Kahneman does not criticize it directly (it had not yet been published), but he criticizes all books of this genre and hits the mark.

The illusion of validity:

Kahneman worked as a psychologist in the army, selecting promising cadets for officer school, those who were expected to become good officers:

One such test, called the "leaderless group challenge", was conducted on an obstacle course. Eight candidates, strangers to one another, in uniforms without shoulder straps or other insignia, were given the task of lifting a long log from the ground and carrying it to a wall about the height of a man. The whole group then had to get over the wall without the log touching the ground or the wall, and without any participant touching the wall. If any rule was violated, the subjects had to report it and start the task again. The problem was solved in different ways. Typically, the team either sent several people over the wall along the log, tilted like a giant fishing rod, or the soldiers climbed onto their comrades' shoulders and jumped over. The last man then had to climb up the slanting log while the others held it, and jump down to the ground. Violations often occurred at this stage, forcing the team to start from scratch. Watching the exercise, my colleague and I noted who took command of the "operation", who tried to assume the leader's role but was rejected, and how much each soldier contributed to the group's success. We identified the stubborn and the submissive, the hot-tempered and the patient, the persistent and the lazy. Every few months we received feedback from the officer school, where the commanders described how the cadets were coping with their training. Each time, our forecasts turned out to be no more accurate than guessing by tossing a coin. Did we understand how ineffective this method was?

I realized one thing: I would like to work as a psychologist in the army.

Intuition and formulas - who wins?

Princeton economist and wine connoisseur Orley Ashenfelter has produced a compelling demonstration of the superiority of simple statistics over the opinions of world-famous experts. Ashenfelter wanted to predict the future price of Bordeaux wines from information available in the year of the harvest. This matters because the wine takes several years to mature, and prices vary enormously with the vintage: wines from the same vineyard, bottled only 12 months apart, can differ in price by a factor of ten or more. The ability to predict these differences is important, since investors buy fine wine, as they buy works of art, in the hope that it will appreciate over the years. It has long been known that the quality of a vintage depends on the weather during the ripening period of the grapes. The best wines are made in years when the summer is dry and warm (Bordeaux winemakers should pay tribute to the greenhouse effect). A rainy spring is another favorable factor, since it increases the yield without hurting the quality. Ashenfelter converted this information into a statistical formula that predicts the price of a wine (for a particular producer and vintage) from three weather variables: the average temperature over the summer, the amount of rain at harvest time, and the rainfall of the previous winter. His formula gives accurate price forecasts years and even decades ahead; indeed, it forecasts future prices more accurately than the current prices of young wines do. This example of the "Meehl pattern" challenges both the abilities of the experts whose opinions set the early prices and the economic theory according to which prices should reflect all available information, including the weather. Ashenfelter's formula is extremely accurate: the correlation between actual and predicted prices is above 0.90.
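Ashenfelter's approach can be sketched as a toy regression. Everything below is invented for illustration (his real formula uses three weather variables and published coefficients; here there is one made-up predictor and fabricated data); the point is only that a fitted formula recovers such a relationship mechanically, with no expert tasting involved.

```python
import random

random.seed(7)

# Fabricated stand-in for the wine data: "price" depends almost linearly
# on growing-season temperature, plus a little noise.
temps = [random.gauss(17.0, 1.0) for _ in range(40)]      # summer temp, C
prices = [0.6 * t + random.gauss(0, 0.1) for t in temps]  # invented price index

def fit_line(xs, ys):
    """Ordinary least squares for a single predictor: (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def correlation(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

slope, intercept = fit_line(temps, prices)
predicted = [slope * t + intercept for t in temps]
r = correlation(predicted, prices)
print(slope, r)  # the formula recovers the relationship; r comes out above 0.9
```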

The premortem

The procedure is simple: when an organization is about to make an important decision but has not yet formally committed to it, those familiar with the plan are called to a meeting and told: "Imagine that you are in the future. We implemented the plan as it now stands. The consequences were catastrophic. Please take 5-10 minutes to write a brief history of the disaster, describing how it all happened." The premortem has two major advantages: it allows you to escape the groupthink that strikes many teams once they converge on a decision, and it pushes the imagination of knowledgeable people in a much-needed direction.


Conclusions and practical benefits of the book:


  • Raising your level of erudition and general good sense. Some books simply increase overall mental power, although it can be difficult to say exactly how;

  • Additional ammunition for the discussions you have to take part in at work from time to time;

  • The topic of forecasting the future from formulas and statistical data is covered very thoroughly; it is an occasion to reflect and apply it at work. The specific mathematical apparatus is described in other books, for example "Statistical Process Control" by David Chambers;

  • Understanding that nothing needs to be memorized; everything should be written down. Carry a notebook with you. Keeping in your head things that must be remembered reduces your mental capacity: you juggle those items, recalling one or another (so as not to forget) instead of thinking;

  • Periodically ask yourself: "Which brain was I just thinking with? The fast one? Or the smart one?";

  • Periodically ask yourself the question "Do I have enough information to make a decision?";

  • Think less where it is not necessary, so as not to wear out your slow system. You will still need it, and otherwise there will be nothing left to think with. (I have seen a person in the state where there is nothing left to think with: at the end of a working day I gave a person with a very high IQ the problem "A ball and a baseball bat together cost 1 dollar 10 cents. The bat costs a dollar more than the ball. How much does the ball cost?" and it knocked him out of commission for a full 10 minutes. He could not solve the problem in his head and began calculating the answer on paper. That is, he would gladly have switched on his slow system, but that server was already down and not responding. When I gave the same problem to another person on a day off, he did the math in his head and gave the answer in 5 seconds.)
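For the record, the bat-and-ball problem yields to one line of algebra; here is a small sketch of the check that System 1 skips:

```python
# Let ball = x cents. Then bat = x + 100, and x + (x + 100) = 110, so
# x = (110 - 100) / 2 = 5. The intuitive answer, 10 cents, fails the check:
# a 10-cent ball makes the bat 110 cents and the pair 120 cents in total.
def solve(total=110, difference=100):
    """Price of the cheaper item given the pair's sum and difference (cents)."""
    ball = (total - difference) / 2
    return ball, ball + difference

ball, bat = solve()
assert ball + bat == 110 and bat - ball == 100
print(ball, bat)  # 5.0 105.0 -- the ball costs five cents, not ten
```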

To summarize:

The difficulty is that in order to apply everything described in the book (which would be very useful), it is not enough to read and understand it. You have to thoroughly rework your habits and bring everything to the level of automatism, which can only be achieved through years of training and self-control (maximal at the start and decreasing as the habits take hold). It is like the difference between watching boxing on TV and going to a boxing gym for years: seeing, understanding, and grasping everything (on TV) is not at all the same as being able to do it yourself, and the latter is what matters when you have to win a fight. At the same time, there is a strong feeling that a person who clearly understands all the described features of thinking, and who hour by hour makes decisions based on this fuller understanding of reality, ends up head and shoulders above the average person in mental development.


Ratings:

Increase in general outlook: 5/5

Practical use: 4/5

Drive while reading: 5/5


Subscribe, thinker!

Daniel Kahneman

Think Slowly... Decide Quickly

In memory of Amos Tversky

Introduction

Perhaps every author thinks about where his book might be useful to readers. Mine will be useful at the notorious office water cooler, where people gossip and trade news. I hope to enrich the vocabulary used to discuss the judgments and choices of others, the company's new policies, or a colleague's investment decisions. Why pay attention to gossip? Because finding and naming other people's mistakes is much easier and more pleasant than admitting your own. It is always difficult to question your own desires and beliefs, especially when it is most needed, but someone else's informed opinion can be useful. We involuntarily expect friends and colleagues to evaluate our decisions, and therefore the quality and content of those expected evaluations matter. The need to gossip intelligently is a powerful motivator for serious self-criticism, more powerful even than a New Year's resolution to make better decisions at work and at home.

A good diagnostician collects many labels that link the idea of a disease to its symptoms, possible causes, preceding events, course and consequences, and ways of curing or alleviating it. Learning the language of medicine is an integral part of learning medicine itself. A deeper understanding of judgments and choices likewise requires a vocabulary richer than the one we use in everyday life. Intelligent gossip draws on the fact that most of the mistakes people make follow certain patterns. Such systematic errors, called biases, arise predictably in the same circumstances. For example, an audience tends to rate an attractive and confident speaker more favorably. Naming this reaction the "halo effect" makes it predictable, recognizable, and understandable.

You can usually tell what you're thinking. The process of thinking seems clear: one conscious thought naturally causes the next. But the mind doesn't just work that way; moreover, it basically works differently. Most impressions and thoughts arise in the mind in a way unknown to you. It is impossible to trace how you came to believe that there was a lamp on the table in front of you, how you detected a slight annoyance in your wife's voice during a telephone conversation, or how you managed to avoid an accident on the road before realizing the danger. The mental work that leads to impressions, hunches, and many decisions usually goes unnoticed.

Intuition errors are discussed in detail in this book. This is not at all an attempt to defame the human mind - after all, for example, the discussion of diseases in medical texts in no way negates good health. We are healthy most of the time, and our actions and judgments are predominantly appropriate to the situation. As we go through life, we allow ourselves to be guided by impressions and feelings, and our confidence in our own intuition is usually justified. But not always. Often we are confident in ourselves, even when we are wrong, but an objective observer easily notices our mistakes.

Therefore, I hope that my book will help to improve the ability to recognize and understand the errors of judgment and choice - first in others, and eventually in ourselves - by providing the reader with a rich and accurate language to describe them. In some cases, correctly diagnosing the problem will prompt remedial action that will reduce the harm caused by bad judgments and bad decisions.

This book presents my current understanding of value judgment and decision making, influenced by discoveries in psychology over the past decades. The main ideas presented here came to me in 1969 when I invited a colleague to speak at a seminar held by the Department of Psychology at the Hebrew University of Jerusalem. At that time, Amos Tversky was a rising star in decision-making research - indeed, in all areas of his scientific activity - so I had no doubt that it would be interesting. Intelligent, sociable and charismatic, Amos had an excellent memory for jokes and anecdotes, skillfully applying them to explain important issues. There was never a dull moment around him. He was then thirty-two, and I was thirty-five.

Amos told the students about a research program at the University of Michigan designed to answer the question, "Does a person have an intuitive understanding of statistics?" Everything was already known about grammar: four-year-old children follow grammatical rules in their speech without any idea that the rules exist. But do people have a similar intuition for the rules of statistics? Amos argued that the answer is "yes", with certain qualifications. We had a heated discussion at the seminar and came to the conclusion that it would be more correct to answer "no", with certain reservations.

After that, Amos and I decided that intuitive statistics was a great topic for joint research. That same Friday we met at the Cafe Rimon, where the Jerusalem bohemians and professors like to gather, and drew up a plan to study the statistical intuition of serious researchers. At the seminar we had concluded that our own intuition was unreliable. Over years of teaching and using statistics in our work, we had never acquired an intuitive sense of the reliability of statistical results obtained on small samples. Our subjective judgments turned out to be biased: we were too willing to believe studies that did not have enough evidence, and in our own research we collected too few observations. We wanted to find out whether other researchers suffered from the same disease.

We prepared a questionnaire with realistic statistical problems of the kind that arise during research. At a conference of the Society for Mathematical Psychology, Amos distributed the questionnaires to experts, among whom were the authors of two statistics textbooks. As we expected, our expert colleagues greatly exaggerated the likelihood that the initial result of an experiment would be successfully replicated on a small sample. In addition, they gave a fictitious graduate student poor advice about the number of observations she needed to collect. As it turned out, even statisticians have poor statistical intuition.
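The blind spot the questionnaire exposed - trusting results from small samples - is easy to demonstrate numerically. Below is a quick sketch (not from the book; plain Python, standard library only) showing that small samples of a fair coin produce "extreme" results far more often than large ones:

```python
import random

random.seed(42)

def extreme_rate(sample_size, trials=50_000, threshold=0.7):
    """Fraction of fair-coin samples whose share of heads reaches
    the threshold (an 'extreme' result)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            extreme += 1
    return extreme / trials

small = extreme_rate(10)    # small sample: roughly 0.17
large = extreme_rate(100)   # large sample: well under 0.001
print(f"n=10:  {small:.3f}")
print(f"n=100: {large:.5f}")
```

Roughly 17% of 10-flip samples show at least 70% heads, while for 100-flip samples the rate is negligible - exactly the kind of fluke that a too-small replication study can mistake for a real effect.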

While we were writing the article, it turned out that Amos and I enjoyed working together. Amos was an incorrigible joker; in his presence I joked too, and we spent hours at a stretch working and having fun at the same time. The pleasure of working together increased our commitment: it is much easier to achieve excellence if you are not bored. But perhaps most important was that we did not overdo criticism, although we both loved to argue and hunt for mistakes, Amos even more than I. In all the long years of our collaboration, we never dismissed a single suggestion of the other out of hand. It was also gratifying that Amos often understood the meaning of my vague ideas better than I did. He thought more logically, was guided by theory, and always kept to the intended path. I relied more on intuition grounded in the psychology of perception, an area that gave us many ideas. The similarity of our characters ensured mutual understanding, and our differences helped us surprise each other. We ended up spending most of our working time together, often on long walks. Fourteen years of collaboration defined our lives, and during those years we achieved the best results of our careers.

The procedure we developed has been followed for many years. Research was conducted in the form of discussions, where we came up with questions and considered our intuitive answers together. Each question was a small experiment, and we did many of them in a day. We were not looking for the only correct answer to the given statistical questions. Our goal was to recognize and analyze the intuitive answer that first came to mind, that we wanted to give, even if we knew that it was wrong. We decided—correctly, as it turned out—that the intuitive response that occurred to both of us would occur to many others, and so it would be easy to demonstrate the impact of such an intuitive response on value judgments.

One day, to our mutual delight, we discovered that we had exactly the same stupid ideas about what several babies we knew would become. We identified a three-year-old argumentative lawyer, a boring professor, a sensitive and overly curious psychotherapist. We understood the absurdity of these predictions, but we still liked them. It was obvious that our intuition was based on the similarity of each of the children with the cultural stereotype of the profession. This fun exercise helped us develop a theory about the role that similarity plays in prediction. Then we tested and developed this theory with many experiments like the following.

Someone describes a neighbor: "Steve is very shy and withdrawn, always ready to help, but has little interest in people or in the real world. He is quiet and tidy, likes order and structure, and is very attentive to detail." Is Steve more likely to be a farmer or a librarian?

To answer this question, consider that Steve was selected at random from a representative sample.

Everyone immediately notes Steve's resemblance to a typical librarian, but equally important statistical considerations are almost always ignored. Did you remember that for every male librarian in the United States there are more than twenty farmers? There are so many farmers that the "quiet and tidy" ones are almost certainly driving tractors rather than sitting at a librarian's desk. Yet we found that the participants in our experiments ignored the statistical facts and relied solely on similarity. We hypothesized that subjects used similarity as a simplifying heuristic (roughly, a rule of thumb) to more easily arrive at a difficult judgment. Reliance on the heuristic, in turn, produced systematic biases (persistent errors) in their predictions.
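The arithmetic behind this can be made explicit with Bayes' rule. In the sketch below (my illustration, not the book's: the 20:1 base rate comes from the text, while the two likelihoods - how typical the description is of each profession - are invented for the example), even a description eight times more typical of librarians still leaves "farmer" as the more probable answer:

```python
def posterior_librarian(base_ratio_farmers=20.0,
                        p_desc_given_librarian=0.4,
                        p_desc_given_farmer=0.05):
    """Bayes' rule with hypothetical likelihoods: the probability
    that Steve is a librarian given the 'quiet and tidy' description.
    The base rate (20 farmers per male librarian) is from the text;
    the likelihoods are illustrative assumptions."""
    librarian_weight = 1.0 * p_desc_given_librarian            # prior weight 1
    farmer_weight = base_ratio_farmers * p_desc_given_farmer   # prior weight 20
    return librarian_weight / (librarian_weight + farmer_weight)

p = posterior_librarian()
print(f"P(librarian | description) = {p:.2f}")  # 0.29 - farmer still wins
```

Because farmers outnumber male librarians 20 to 1, the prior swamps the likelihood: the posterior comes out near 0.29, so the similarity-based answer "librarian" gets the probabilities backwards.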

Psychologist, professor, Nobel laureate Daniel Kahneman is the author of the best-selling book Think Slowly... Decide Fast. He is engaged in the study of human thinking, the features of how a person makes decisions, what he focuses on, what role the unconscious plays in this.

It is generally accepted that man is a rational being. At the same time, if he acts irrationally, then it is assumed that this happens under the influence of emotions. The author of this book expresses his opinion on this matter. He says that thinking errors can be caused by thinking itself, and not by the emotions experienced.

Daniel Kahneman distinguishes two types of thinking. One of them can be considered fast: a person makes a decision spontaneously, without wasting time. The second type is slower; this thinking is engaged when a person carefully considers his actions, solves a problem, or looks for an answer to a difficult question, weighing all the options. Thus, errors of thinking may lie precisely in using the wrong type of thinking. The professor also notes that the first type requires less energy than the second. The organism tends to conserve energy, and is therefore likely to accept the option proposed by the first system of thinking without spending effort on careful deliberation.

For example, when a person sees something familiar, he begins to think that it is true and even safe. This is why constant repetition is considered the best way to convince people, even if what is repeated is not true. First impressions work in exactly the same way: once the first system of thinking has reached a conclusion, good or bad, the second system does not consider it necessary to engage, because the answer is already there.

The book contains a lot of information. All this is not only very interesting, but can also be useful in work and everyday life. The professor teaches how to control your thinking and make the right decisions.


    Rated the book

    I tearfully asked a friend to bring me from Almaty a book by Daniel Kahneman - one of the leading psychologists, a Nobel laureate in economics, and one of the most influential people in the world of finance.

    I started reading, and by page 60 I wondered: had my intuition for choosing books let me down? Until then, the books I picked had always hit the right spot. When choosing books, I never follow the fashion that screams "Bestseller!!!" or a "delicious" review, even one like the Boston Globe's on the back cover of Think Slow... Decide Fast: "If you can only read one book this year, read this one!" I trust proven people, that is, those close to me in spirit, and their advice.

    So what did I do wrong? My relationship with Mr. Kahneman soured right away, as soon as System 1 and System 2 appeared on the pages. Both of my systems refused to accept the logic problems, the attentiveness and intelligence exercises, and the scientific explanations.

    But then, then the real magic began. “Most of us see the world as friendlier, our own personalities more agreeable, and our goals more achievable than they really are. We also tend to exaggerate our own ability to foresee the future, which makes us overconfident. When it comes to the consequences of cognitive biases, the optimistic bias is perhaps the one that has the strongest impact on decision making. It can be both good luck and bad luck, so if you are an optimist by nature, you should be on your guard.”

    I think Kahneman is aware that he is difficult to read. But he is good, really good. He pulls the reader into the jungle of their own irrational actions and wrong decisions, into the very depths of the brain. More precisely, into both types of thinking at your disposal: "fast" (automatic, instinctive, emotional) and "slow" (rational and logical). Perhaps you often overestimate yourself? Or maybe you are an incorrigible optimist? Have you ever hired a person on the basis of "I liked him right away"? Kahneman, drawing on scientific research, shows how mistakes in planning can be avoided. And not only that! Kahneman teaches you to avoid hindsight distortions, when illusions about the past control your future.

    True, after this book, sales of many MIF titles may fall, because Kahneman teaches skepticism toward all manuals and management practices; even my favorite Jim Collins made this list. After all, stories of the rise and fall of companies strike a chord with the reader: they offer what our minds yearn for, a simple plot. A plot that creates the illusion of understanding and teaches the gullible reader a lesson of transient value.
    The book teaches you to examine your subjective beliefs, to recognize that you may not be aware of the limits of your own professional skill, and to resist the temptation of prediction.

    It is recommended not only for people who are professionally involved in psychology, but also for entrepreneurs, managers, recruiters, insurers, traders, experts, and anyone who wants to take a fresh look at themselves, understand what drives them, what controls their actions.

    Rated the book

    This book is a clear leader in the number of quotes and information that I wrote out for future study. There are dozens of pages of notes and thoughts on the topics raised in this book - and I still chose only the most important and interesting for myself. I think if I wrote out all the new important things from this book in general, I would get a small volume the size of some average "book on how to change your life for lazy dummies." True, there would be a thousand times more benefit from this "squeeze".

    I'm not afraid of this recommendation: I think this book is worth reading for everyone who has ever wondered why seemingly logical economic theories fail, or who knows what the "gambler's fallacy" is. And in general, for everyone interested in how our consciousness works and why certain methods of manipulation work on us. All these are just small details of the big problem the author describes in this book.

    The main idea is that a person has two completely different systems of thinking: one is fast and easily trained, and we usually call it intuition. The second system is slow and does not like work, but it is able to teach the first system various things. It is this system that lets you solve complex problems that intuition cannot cope with; we would call it our mind. But it is lazy and usually prefers not to interfere in all these "showdowns with reality", relying on much easier ways to find a solution.

    So it turns out that in many things we tend to rely on the decisions our intuition generates for us - after all, this requires no special effort. And so we trust it on a variety of questions, without even being particularly aware of it. Yes, it copes well with most everyday questions, such as whether to take an umbrella or how thirsty you are right now. But there are entire areas where intuition is simply wrong. It tends to substitute one concept for another, is easily confused, and relies too much on past experience, paying no attention to differences in circumstances. And knowing exactly how this system can be deceived, you can do amazing things, both with one person and with entire groups.

    Why is it that in one European country almost 90% of the adult population are organ donors, while in a neighboring country the figure is a miserable 4%? What simple solution achieves such a difference? How can a person be made to believe that a part is greater than the whole? Or be made, while in their right mind, to prefer a longer painful procedure to a similar but shorter one? You will not believe it, but all this is really possible, and for the most part such effects have a completely logical and understandable explanation, which the author provides.

    Of course, the book is not without drawbacks. It is large and long - about a thousand pages - and reads quite unevenly in places. That is, it seems well written, with varied examples and detailed explanations, but at times the author, it seems to me, gets too carried away with explanations of obvious things, repetitions, and references to his own experience.

    But do not worry: getting into it is quite easy - you just have to join this flow of information and arm yourself with something to take notes. After that, you stop noticing the flaws... at least until the moment comes to surface from the story, until the next dive, when you again spend some time on immersion. Against the background of its informativeness and food for thought, all of this looks like petty nit-picking. This is a really great and important book that should be read by anyone who... no, I would even say simply by everyone. Naturally, if you are not afraid of large amounts of information.

    Rated the book

    Before I started listening to the audiobook Think Slow... Decide Fast, I had absolutely no idea about a man named Daniel Kahneman. It's a pity... Otherwise, the book would have been read, not listened to.

    Since I started my review-reasoning with the author of the book, then, perhaps, it is worth finishing the thought. Daniel Kahneman won the 2002 Nobel Prize in Economics "for the use of psychological methods in economics". The main merit of the scientist is that he established the cognitive basis for common human fallacies that stem from heuristics and biases. Impressive?

    But back to the book.

    For me, Kahneman's research is of interest from the point of view of the theory and practice of adult learning. Analyzing people's actions and behavior, the scientist concludes that every person has two types of thinking. The first type is quick or intuitive thinking (Kahneman calls it "System 1"). The second is slow or rational thinking ("System 2").

    What is the difference between System 1 and System 2?

    Intuitive thinking is an instant reaction, and therefore it often leads to errors of judgment. Rational thinking, unlike System 1, requires serious intellectual operations. According to Kahneman, the human mind is lazy, so switching on System 2 takes additional cognitive effort, which naturally requires a certain expenditure of energy. That is why, in ordinary everyday situations, a person most often relies on the fast type of thinking. Learning, creating a new intellectual product, or solving a complex problem, on the other hand, triggers the slow (and, as mentioned above, energy-consuming) type of thinking.

    If System 2 is busy, System 1 influences behavior more than usual - and it has a sweet tooth.

    Another difference between the two systems is that intuitive thinking is the basis of feelings and impressions, which is why System 1 is called emotional. Slow thinking is associated with such logical operations as analysis, synthesis, abstraction, generalization, etc.

    System 1 is radically insensitive to both the quantity and the quality of the information on which impressions and hunches are based.

    Each system is responsible for certain functions. The human brain is simply not able to monitor every situation around and inside itself at every moment. I would perhaps add that intuitive thinking can also be seen as a kind of protective mechanism of the personality. Rational thinking, in turn, is responsible for self-control, thanks to which a person treats first impressions critically, resisting illusions and hasty conclusions.

    By the way, the simplest illustrative example of how the systems operate is the Muller-Lyer optical-geometric illusion, in which the length of two lines seems different depending on the direction of the arrows at their ends. At first glance, the segment with inward-pointing arrowheads seems shorter than the segment with outward-pointing "tails" (System 1). But as soon as we measure the segments, it immediately becomes clear that they are the same length (System 2).

    One of D. Kahneman's main tasks is, by citing countless examples from his own research, to teach the reader to cope with the intuitive predictions (false judgments, estimates, premonitions) for which System 1 is responsible. And although the book describes no specific techniques for switching on System 2, the author easily convinces the reader that only with additional effort, including concentration, can any activity become more effective.

    As we say in Russia: "You can't pull a fish out of a pond without effort." Any success is, first of all, the result of serious intellectual effort, not of intuition and chance.

    Those who avoid the sin of intellectual laziness can be called "involved." They are more attentive, more intellectually active, less likely to be satisfied with superficially attractive answers, more skeptical of their intuition.

A person's actions and deeds are determined by his thoughts. But is a person able to control his thoughts constantly? Often people commit irrational acts and make irrational decisions. Moreover, people have two systems of thought: slow and fast.

Daniel Kahneman talks about these and other features of thinking in his book Think Slowly... Decide Fast. The information presented in this book may seem paradoxical and even somewhat shocking, but despite this, the book is written in a simple, easy and very interesting language.

Kahneman's work is appreciated by many people, and its practical value is hard to overstate. After reading this book, you will be able to understand many mechanisms of thought and learn to make better decisions in any area of life.

About Daniel Kahneman

Daniel Kahneman is an Israeli-American psychologist and Nobel laureate in economics. He made a significant contribution to psychological economic theory, which explains, from an economic standpoint, the irrationality of people's attitudes to risk when making decisions and managing their own behavior. Kahneman is also known for conducting, together with a group of co-authors, a large-scale study of the cognitive basis of common human fallacies, which resulted in the book "Judgment under Uncertainty: Heuristics and Biases" (D. Kahneman, P. Slovic, A. Tversky).

Summary of the book "Think Slowly... Decide Fast"

The book consists of an introduction and several introductory chapters, two large parts and notes. Below we present some interesting ideas of the author, which he describes in detail in his work.

Introduction

As a rule, a person judges what is happening around him guided by his own impressions and feelings, and usually they do not let him down. But there are situations when he is confident in himself even when he is wrong. To avoid many mistakes, a person needs to understand what errors of judgment and choice look like, in others and in himself. And in many situations, correctly diagnosing the problem helps reduce the harm that follows from bad judgments and decisions.

Two Modes of Thinking

Humans have two systems of thought - "fast" thinking, which can be called emotional, instinctive or automatic, and "slow" thinking, which is also rational.

The hallmark of fast thinking is instant reaction, while slow thinking demands allocated energy and mental effort. Fast thinking underlies feelings and impressions, while slow thinking is associated with concentration, choice, deliberate activity, and so on.

The conflict of two minds and self-control

Sometimes a person feels a conflict between what he should do and the reaction that has become an obstacle to doing it. And the whole point here is self-control, for which slow thinking is responsible. There is also a difference between the impressions received and the beliefs held, and in order to be able to resist the illusion, one must learn not to trust first impressions. Even the most plausible answer is not always correct. Quick thinking works automatically, and you can’t turn it off by magic. But you can make sure that slow thinking is always on the alert.

Association mechanism

The human mind itself establishes causal relationships between different words, instantly causing a corresponding reaction. Words are associated with memories, which are associated with feelings that form bodily reactions. Fast thinking controls many of the actions of people, gives impressions, forms beliefs and serves as the basis for choice. But it is also a source of both accurate judgments and regular mental errors.

Illusion of truth

If a person is forced to repeat something often, he will come to believe even fables, because distinguishing the feeling of familiarity from the truth can be difficult. Even one familiar phrase in an otherwise false statement may be enough to make it believable. On the other hand, writing something true does not by itself mean you will be believed. To be believed, you can exploit the illusion of cognitive ease, which is achieved by increasing the readability of the text. The text can also be made more memorable, even given the form of verse - this makes it psychologically easier to accept.

The mechanism of jumping to conclusions

Jumping to conclusions can be effective if the conclusions are likely to be correct, if the cost of an occasional mistake is acceptable, and if the haste saves time and effort. But in unfamiliar situations, with high stakes and no time to gather more information, such conclusions are highly unreliable. To avoid mistakes, you need to switch on slow thinking, which helps check the veracity of a message and keep your judgments independent.

Anchor effect

The anchoring effect occurs when a person encounters an arbitrary number before estimating an unknown quantity. For example, the same house will be perceived differently depending on the rent being asked for it. The effect appears with any number offered as a possible answer to the problem (for example, the amount a person is willing to pay for the house). Although in some situations this may seem reasonable, in most cases it makes a person suggestible and allows their gullibility to be exploited.

Availability

Availability here means a person's susceptibility to easily recalled information, above all from the media. For example, if the news shows that a plane has crashed or an accident has occurred, people begin to buy insurance, fear flying, and so on. People guided by fast thinking are subject to such distortions more than those who use slow thinking.

How to deal with intuitive predictions

Quite often, intuition and fast thinking become the source of false judgments: premonitions, estimates, assumptions, and the like. Many choices turn out wrong and distorted for this reason, especially when a favorable outcome promises large benefits. If you want to stop fooling yourself with lightning-fast forecasts, you need to activate slow thinking.

Re-evaluation of rare events

The occurrence of some extraordinary event is usually accompanied by a stream of available data created by the media. The reason for this is uncontrolled, automatic and associative emotional arousal (which is why, by the way, terrorism is one of the most effective means of social influence, since emotional arousal is the cause of defensive behavior). Despite the fact that slow thinking gives a person the understanding that the chances of realizing a “terrible” event are extremely small, he cannot get rid of discomfort.

People often go overboard in assuming that rare events are bound to happen, and this overestimation of unlikely events affects everything from emotional state to behavior.

Conclusion about two systems of thought

In Daniel Kahneman's book Think Slow... Decide Fast, the workings of the human brain are described as a complex interaction between two components: slow and fast thinking. After reading this book, you will be better able to anticipate how these two components act in all sorts of situations.

Slow thinking has a formative effect on judgments and choices: it evaluates the ideas and feelings that fast thinking produces. But slow thinking should not be seen merely as a watchdog for fast thinking; in a huge number of situations it effectively blocks unnecessary impulses and foolish thoughts, and with its help any activity becomes more efficient and the right choices become possible.

To make as few mistakes as possible in your life, remember first of all that nothing is achieved without effort. Our intuition can make us self-confident and categorical, but we should not trust it blindly. Only through deliberate mental effort can we spot situations with a high probability of error. The main safeguard against mistakes is to recognize the signs of being in a "danger zone", switch on slow thinking, and give it the right to evaluate and choose.