
Reason, Almost

Thinking, Fast and Slow
By Daniel Kahneman
(Farrar, Straus and Giroux, 499 pp., $30)

Humans are rational animals. In addition to the rich perceptual, cognitive, and motivational systems that they share with other creatures, they have a unique way of forming beliefs and preferences that is not instinctive but deliberate. They can form beliefs by logical inference, assessment of evidence, and statistical judgments of likelihood; and they can form preferences and make choices by the principled evaluation of alternative courses of action, on the basis of their rationally formed beliefs about the likely outcomes of those alternatives. The results of all these forms of reasoning can be preserved, recorded, and passed on to others, making possible the growth of knowledge and civilization. Reason has created our world.

But reason depends on a constant supply of material from our pre-rational, animal nature—from perception, feeling, and natural desires. There is also an intermediate level of automatic judgment, some of it learned through experience, that operates more quickly than conscious reasoning and is essential for navigating the world in real time. The relation among these faculties is complicated. Even when we think we are using reason to arrive at the right answer to some factual or practical question—taking the relevant data consciously into account—our reasoning may be influenced more directly, without our knowledge, by the instinctive forces with which it coexists.

Daniel Kahneman is one of the psychologists who have done the most to advance our understanding of how this complex set of mental factors works, and of some of the problems that arise in the interaction among its components. His name will always be linked with that of Amos Tversky, whose early death in 1996 at the age of fifty-nine brought their long intellectual collaboration to a close. This book is dedicated to Tversky’s memory, and it is a tribute to him and an admirable account of their work together. It is also a clear, comprehensive, and often witty introduction to an interesting area of research, written by a leading contributor with exceptional expository gifts.

REASON IS traditionally divided into two types, theoretical and practical, which control the formation of beliefs and the determination of choices, respectively. It is a scarce resource in each of us. Most of the time, in most respects, we have to operate on autopilot, because we cannot spare the conscious attention to identify and weigh up the pros and cons for everything we do or think. Kahneman’s aim in his book is not just theoretical but also practical. He wants to provide us with a degree of self-understanding that will permit us to manage our thoughts and choices better, both individually and collectively. He says he cringes when his work with Tversky “is credited with demonstrating that human choices are irrational.” This may seem an odd reaction, since the pair certainly showed that humans are systematically prone to make certain mistakes. But mistakes are not irrational until one has, and fails to use, the means to avoid them. Kahneman’s aim is not to criticize human nature, but to identify features of the way human beings function that had not been recognized by influential theories of rational choice, and to suggest how we can guard against some of their untoward consequences.

The distinction in Kahneman’s title, between thinking fast and slow, corresponds to a familiar distinction between two categories of mental functioning, which he follows common psychological usage in calling System 1 and System 2, described thus: “System 1 operates automatically and quickly, with little or no effort, and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.”

System 1 governs most of our perceptions, our use of language, our navigation through the world, and our social interactions. We process so much constantly changing material from moment to moment that automatic responses are indispensable. Sometimes, however, these shortcuts give results that conscious reflection, calculation, or measurement carried out by System 2 would show to be wrong. We cannot in general displace System 1 responses by System 2 reflection—there just isn’t time; but it pays to know when System 2 should be brought into action to correct for errors.

As Kahneman observes, System 2 is lazy. It is much easier to jump to conclusions than to weigh evidence and check the validity of arguments. Kahneman’s strategy in overcoming this difficulty has two components: an empirically based account of how people actually think and some recommendations for improvement through methods that correct for the most common errors produced by System 1. As I have said, the subject also divides between the formation of beliefs and the determination of choices, between fact and value, though some important mechanisms operate in both domains.

CONSIDER, FOR EXAMPLE, the “anchoring effect,” which has been demonstrated again and again in many contexts of both belief and choice. Where someone has to estimate a quantity, or make a quantitative choice, the mention in advance of a particular number, even if it comes from a completely irrelevant source, will have a strong influence on the answer. In one of their early experiments, Kahneman and Tversky spun a wheel with one hundred numbers, rigged to stop at 10 or 65, and asked people whether the percentage of African nations among U.N. members was greater or smaller than the number that came up. They then asked them for their best guess of the correct percentage. The average estimates of those who saw 10 and 65 were 25 percent and 45 percent, respectively. The “anchoring effect”—the ratio of the difference between the average estimates to the difference between the anchors—comes to about 36 percent (20/55); effects of this magnitude are typical. In another experiment, German judges read a description of a woman who had been caught shoplifting, and after rolling a pair of dice that were loaded to result in either a 3 or a 9, they were asked, first, whether they would sentence her to more or fewer months than that number, and then what sentence they would give her. On average, those who had rolled a 9 said they would sentence her to eight months; those who rolled a 3 said five months—an anchoring effect of 50 percent.
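
As a quick check on the arithmetic, here is a minimal sketch (mine, not from the book) that computes the standard anchoring index, the spread in average estimates divided by the spread in anchors, for both experiments:

```python
def anchoring_index(low_anchor, high_anchor, low_estimate, high_estimate):
    """Spread in average estimates divided by spread in anchors."""
    return (high_estimate - low_estimate) / (high_anchor - low_anchor)

# U.N. question: anchors 10 and 65, average estimates 25 percent and 45 percent.
print(anchoring_index(10, 65, 25, 45))  # 0.3636... (about 36 percent)

# German judges: loaded dice showing 3 or 9, average sentences 5 and 8 months.
print(anchoring_index(3, 9, 5, 8))      # 0.5 (50 percent)
```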

These influences are not conscious: everyone knows the numbers produced by the wheel or the dice are irrelevant. But in cases where an answer must be partly determined by guess or intuition, the suggestive prompting of such salient inputs is very powerful. Even if you try to resist it, you may fail, as was shown in an experiment with real estate agents, whose estimates of a reasonable buying price for a house they visited and examined carefully turned out to depend significantly on what they were told was the asking price, though they all took pride in their ability to ignore it. The anchoring effect was 41 percent.

Another important mental phenomenon is substitution: asked a question that would require some conscious reasoning to answer, one answers a different question that is easier. When someone is given a description of a person and asked about the relative likelihood of his being in a certain profession, the answer will usually depend entirely on how closely the description fits the stereotype of a typical member of the profession, ignoring the important factor of the different numbers of people in different professions. This is called base-rate neglect.

A similar error is involved in one of Kahneman and Tversky’s most flagrant examples. Presented with a stereotypical description of a woman sympathetic to progressive social and political causes, most subjects thought that the probability of her being a feminist bank teller was higher than the probability of her being a bank teller—a logical impossibility. It is like believing that a Norwegian is more likely to have blond hair than to have hair.
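
The underlying logic is simply set inclusion: feminist bank tellers are a subset of bank tellers, so the conjunction can never be the more probable event. A toy simulation (with invented probabilities, purely for illustration) makes the point concrete:

```python
import random

random.seed(0)
N = 100_000

# Invented numbers: the two traits are assigned arbitrary probabilities;
# the inequality checked below holds no matter what they are.
teller = [random.random() < 0.02 for _ in range(N)]
feminist = [random.random() < 0.30 for _ in range(N)]

p_teller = sum(teller) / N
p_feminist_teller = sum(t and f for t, f in zip(teller, feminist)) / N

assert p_feminist_teller <= p_teller  # the conjunction is never more probable
print(p_teller, p_feminist_teller)
```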

KAHNEMAN DESCRIBES the many ways in which intuitive judgments of likelihood tend to be unreliable—often with serious consequences for legal and policy choices. Salient examples that receive media coverage or make a strong emotional impression are given evidential weight independent of how typical they are. Small samples with unusual characteristics are taken as having causal significance, when they merely reflect random variation. The Gates Foundation spent $1.7 billion to encourage smaller schools, on the basis of findings that the most successful schools were small; but the least successful schools are also small. Both facts are a statistical consequence of the principle that the larger the sample from a population of individuals in which a characteristic such as academic performance varies, the closer the sample’s average level is likely to be to that of the population as a whole; whereas the smaller the sample, the more likely it is to diverge in one direction or another from that average. But people have a thirst for instructive causal explanations of what is good or bad, and tend to postulate them in many cases where an anomalous outcome is really due to chance. This is presumably at work in the current superstition that inoculations cause autism.
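
The statistical point can be seen in a small simulation (my sketch, not Kahneman’s): draw every student’s score from one and the same distribution, so that schools differ only by sampling noise, and the best and the worst schools will still be overwhelmingly the small ones:

```python
import random
import statistics

random.seed(1)

def school_average(n_students):
    # All students draw from the same distribution (mean 100, sd 15),
    # so differences between school averages are pure sampling noise.
    return statistics.mean(random.gauss(100, 15) for _ in range(n_students))

schools = [("small", school_average(20)) for _ in range(500)]
schools += [("large", school_average(500)) for _ in range(500)]
schools.sort(key=lambda pair: pair[1])

# The extremes at both ends are dominated by the small schools.
print("bottom 20:", [size for size, _ in schools[:20]])
print("top 20:   ", [size for size, _ in schools[-20:]])
```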

Kahneman is particularly hard on the stock market as a site of illusion. The lack of correlation in performance from year to year of almost all traders, advisers, and mutual funds shows that success or failure is essentially due to luck. Yet professional investors are rewarded on the basis of those results, which are not under their control. Kahneman explains that presentation of the statistical facts makes no impression on the participants because it cannot derail the subjective sense of informed judgment that accompanies these gambles. For similar statistical reasons Kahneman is skeptical about the contributions of CEOs’ competence to the success or failure of their companies, which depends largely on factors beyond their control. “The CEO of a successful company,” he writes, “is likely to be called flexible, methodical, and decisive. Imagine that a year has passed and things have gone sour. The same executive is now described as confused, rigid, and authoritarian.” Where skill is decisive and much more important than chance, as in highly controlled accomplishments such as orthodontia, golf, or musical performance, it will show up through the persistence over time of individual differences.

Kahneman’s recommendation is that when making predictions or plans, one should try to identify a relevant class to which the case belongs, and take as a baseline the statistics of outcomes for that class, before factoring in any intuitive sense one may have about the particular case. And yet he also recognizes that unwarranted optimism in defiance of statistical evidence may have its uses as an indispensable driving force in a competitive economy, and for that matter in scientific research, even though it usually results in failed restaurants, failed inventions, and failed experiments.

SO FAR I HAVE BEEN discussing mostly errors due to System 1 in the formation of beliefs; but Kahneman’s most influential contributions, for which he received the Nobel Prize in economics, have to do with the formation of preferences and choices. He and Tversky demonstrated that expected utility theory, the model of choice long used by economists, was seriously inaccurate as an account of how people actually behave, and they proposed an alternative that they called prospect theory.

Most choices, and all economic choices, involve some uncertainty about their outcomes, and expected utility theory describes a uniform standard for determining the rationality of choices under uncertainty, where probabilities can be assigned to different possible outcomes. The standard seems self-evident: the value of a 50 percent probability of a given outcome is half the value the outcome would have if it actually occurred, and in general the value of any choice under uncertainty depends on the values of its possible outcomes, multiplied by their probabilities. Rationality in decision consists in making choices on the basis of “expected value,” which means picking the alternative that maximizes the sum of the products of utility and probability for all the possible outcomes. So a 10 percent chance of $1,000 is better than a 50 percent chance of $150, an 80 percent chance of $100 plus a 20 percent chance of $10 is better than a 100 percent chance of $80, and so forth.
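
The arithmetic behind these comparisons is elementary; a few lines suffice to verify them:

```python
def expected_value(lottery):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in lottery)

print(expected_value([(0.10, 1000)]))             # 100.0
print(expected_value([(0.50, 150)]))              # 75.0
print(expected_value([(0.80, 100), (0.20, 10)]))  # 82.0
print(expected_value([(1.00, 80)]))               # 80.0
```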

The principle of maximizing expected value was conceived as both a principle of what is rational and a way of predicting how cool-headed individuals will choose under conditions of uncertainty. But Kahneman and Tversky showed that people do not act in accordance with expected utility theory—whatever may be the truth about the ideal standard of rationality. They do not care only about the value of outcomes, even discounted by probability. They care about whether the outcome is the result of a gain or a loss, and they assign greater importance to losses than to gains. Moreover, the weights they assign to outcomes are not uniformly proportional to probabilities; the relation is much more complicated.

In general, the negative trumps the positive: losses are more powerful than the corresponding gains. Most people will not accept a bet with a 50 percent chance of winning $125 and a 50 percent chance of losing $100. And for high probabilities, people are risk-averse with respect to gains and risk-seeking with respect to losses: they would choose $9,000 for sure over a 90 percent chance to win $10,000, but would prefer a 90 percent chance of losing $10,000 over losing $9,000 for sure. Yet when the probabilities are small, the preference is reversed. Most people prefer a 5 percent chance to win $10,000 over getting $500 for sure; and most people prefer a certain loss of $500 to a 5 percent chance of losing $10,000. For small probabilities, the former, risk-seeking preference is why people buy lottery tickets; and the latter, risk-averse preference is why people buy insurance. Kahneman brings in other factors as well, but these dispositions are particularly important, and he points out that they have an effect on the settlement of lawsuits, whose outcome is always uncertain: a big potential loser in a case with low probability of success will be more eager to settle than a big potential winner, so the plaintiff in a frivolous suit has a bargaining advantage over the defendant.
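
Prospect theory captures loss aversion with an asymmetric value function. As a sketch, using the median parameter estimates that Tversky and Kahneman reported in their 1992 paper (the review itself gives no parameters), the bet to win $125 or lose $100 has positive expected value but negative prospect value:

```python
# Prospect-theory value function. The exponent and the loss-aversion
# coefficient are the median estimates from Tversky and Kahneman (1992):
# alpha = 0.88 for both gains and losses, lambda = 2.25.
ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    """Subjective value of a gain or loss x relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

# The 50/50 bet to win $125 or lose $100: expected value +$12.50,
# but the prospect value is negative, so a loss-averse agent declines it.
print(0.5 * value(125) + 0.5 * value(-100))  # about -29.7
```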

Another significant divergence from expected utility theory is found in the reaction to probabilities very close to zero. People tend either to ignore very small probabilities or to overweight them: a cancer risk of 0.001 percent cannot be easily distinguished from a risk of 0.00001 percent. But if even a tiny cancer risk attracts attention, it is likely to be weighted out of proportion to its probability in the choice of actions or policies.
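
Prospect theory models this with a probability-weighting function that overweights small probabilities and underweights moderate and large ones. Here is a sketch using one standard parametric form, with the gamma of about 0.61 that Tversky and Kahneman estimated for gains in 1992 (an assumption imported from that paper, not from the review):

```python
# Probability-weighting function of Tversky and Kahneman (1992),
# with their estimated gamma = 0.61 for gains.
GAMMA = 0.61

def weight(p):
    """Decision weight attached to an outcome of stated probability p."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

for p in (0.00001, 0.001, 0.05, 0.50, 0.95):
    print(f"p = {p:<7g}  decision weight = {weight(p):.5f}")
# Tiny probabilities are weighted far above their stated values,
# while 0.50 and 0.95 are weighted below them.
```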

The phenomena of loss aversion, risk aversion, and risk seeking are sufficiently robust that prospect theory is now widely recognized as an improvement over expected utility theory as a predictor of individual choices. Behavioral economics is based on the systematic use of these findings. But Kahneman remains sympathetic to expected utility as a normative standard for rational choice, since it yields better results over the long term. He urges that we control the distorting emotional influence of our natural loss aversion by rehearsing “the mantra that will get you significantly closer to economic rationality: you win a few, you lose a few.”
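
The force of the mantra is statistical: a single win-$125/lose-$100 coin flip carries an even chance of a loss, but over a bundle of such bets the probability of coming out behind shrinks steadily. A minimal simulation (my illustration, not Kahneman’s):

```python
import random

random.seed(2)

def bundle_payoff(n_bets):
    """Total payoff from n independent 50/50 bets: win $125 or lose $100."""
    return sum(125 if random.random() < 0.5 else -100 for _ in range(n_bets))

TRIALS = 10_000
for n in (1, 10, 100, 1000):
    losing = sum(bundle_payoff(n) < 0 for _ in range(TRIALS))
    print(f"{n:>4} bets: net loss in {losing / TRIALS:.1%} of trials")
# Roughly 50%, 36%, 13%, and well under 1% of trials end in a net loss.
```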

MORE RECENTLY, Kahneman has turned his attention to the measurement of happiness or well-being, and here again his claims take a dualistic form. He distinguishes between two selves, the experiencing self and the remembering self. In relation to an unpleasant experience such as a colonoscopy, for example: “The experiencing self is the one that answers the question: ‘Does it hurt now?’ The remembering self is the one that answers the question: ‘How was it, on the whole?’ Memories are all we get to keep from our experience of living, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self.”

It turns out that the judgments of the remembering self bear a peculiar and indirect relation to those of the experiencing self. You might think that when asked to compare two unpleasant experiences after they are over, subjects would be influenced in their evaluation by the total amount of pain in each, which depends on both the duration and the average intensity. But this is not the case. Retrospective evaluation is determined by two other factors: the level of pain experienced at the worst moment of the experience and at its end. Duration seems to have no effect, and an unpleasant experience that differs from another only by having a period of less severe pain tacked on at the end will be remembered as less bad, even though it includes more total pain. Kahneman thinks there is clearly something wrong here:
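
In Kahneman’s colonoscopy studies, retrospective ratings were well predicted by the average of the worst moment and the final moment. A sketch of that scoring rule (the episodes and pain numbers below are invented) shows how adding a milder coda increases total pain while improving the memory:

```python
def total_pain(episode):
    """Sum of momentary pain ratings: what the experiencing self accumulates."""
    return sum(episode)

def remembered_pain(episode):
    # Peak-end rule: retrospective judgment tracks the average of the worst
    # moment and the final moment; duration is neglected entirely.
    return (max(episode) + episode[-1]) / 2

short = [2, 5, 8, 8]        # a procedure that ends at its most painful point
longer = short + [5, 3, 2]  # the same procedure plus a less painful coda

print(total_pain(short), total_pain(longer))            # 23 vs 33: more total pain
print(remembered_pain(short), remembered_pain(longer))  # 8.0 vs 5.0: remembered as less bad
```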

The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.

These misleading effects, which Kahneman calls the peak-end rule and duration neglect, apply to the evaluation of pleasant experiences as well.

But there are other, legitimate ways in which the overall evaluation of aspects of our lives, or of our lives as a whole, is not just a function of how experientially pleasant or unpleasant they are moment by moment. One of the most important influences on people’s satisfaction with their lives is what their goals in life are, and whether they have achieved them. With unintentional comedy Kahneman reports that it required empirical research to bring him to the revelation that this should be regarded as an independent type of value, rather than some kind of mistake:

In part because of these findings I have changed my mind about the definition of well-being. The goals that people set for themselves are so important to what they do and how they feel about it that an exclusive focus on experienced well-being is not tenable. We cannot hold a concept of well-being that ignores what people want.

So he now holds a hybrid view. But the fact that it required psychological studies to convince him that what people want and what they achieve, and not just how they feel, is important for the evaluation of their lives, indicates that he started out with an unbelievably myopic conception of value.

Kahneman continues to display the same blinkered hedonism when discussing the fact that those who suffer from major physical deficits, such as paraplegics and colostomy patients, do not have lower moment-to-moment experiential well-being than healthy people, though they give a much lower assessment to the quality of their lives. Kahneman regards this as an illusion:

Experience sampling shows no difference in experienced happiness between these patients and a healthy population. Yet colostomy patients would be willing to trade away years of their life for a shorter life without the colostomy. Furthermore, patients whose colostomy has been reversed remember their time in this condition as awful, and they would give up even more of their remaining life not to have to return to it. Here it appears that the remembering self is subject to a massive focusing illusion about the life that the experiencing self endures quite comfortably.

Kahneman apparently regards the overwhelming desire of these patients not to spend their lives leaking excrement into a plastic bag as a “focusing illusion,” because it produces experiential displeasure only when they think about it, which is not most of the time. He cannot imagine that the desire itself might be reasonable. Pleasure and the avoidance of pain are very important, but people care about a lot else. And to depart even further from Kahneman’s essentially utilitarian perspective: some of those things, including knowledge, freedom, and physical independence, are—dare I say it?—good in themselves, which is why people care about them.

STILL, EVEN IF one takes a more sophisticated view of the sources of value, Kahneman’s observations about how people make choices are highly relevant to public policy. This is not a work of moral or political theory, but it gives us important information about how to achieve our aims, whatever they may be, or at least how to improve the likelihood of success. If a society wants to encourage retirement savings, or organ donation at death, it can do so most effectively not by exhortation or coercion, but by setting a default position, so that people have to make a conscious choice to opt out of the pension plan or to refuse to donate their organs. (This is known as libertarian paternalism, since it is not coercive, and it is elaborated in detail by Richard Thaler and Cass R. Sunstein in their book Nudge.) The anchoring effect of the status quo, or of a neutral reference point, is enormous. But we must also be on guard against the malign influence of the status quo, which can lead us to throw good money after bad, because abandoning a failed project or investment for something more promising would require us to acknowledge the loss of what we have already wasted on it—the so-called fallacy of sunk costs.

Kahneman’s findings should not surprise anyone who has had to make many decisions quickly on the basis of disparate information. My own academic experience of evaluating large numbers of candidates for appointment or for admission to graduate school does not inspire confidence in the accuracy of such judgments or their immunity to unrecognized and irrelevant influences. Kahneman believes that the main practical lesson to take from his book is that one should cultivate the standpoint of the outside observer, or even the critical input of actual outside observers, to provide a warning when the gut responses of System 1 need to be reviewed by the more burdensome calculations and rule-governed procedures of System 2. But for any significant case, such methods have to be developed by insiders to the area of judgment.

These results leave open the question of whether progress toward greater objectivity will come primarily from collective institutional practices or from the gradual internalization of critical standards by individuals. In some domains, such as science, law, and even morality, progress depends on reflection and the attempt to identify sources of error, carried out initially by a minority; then eventually institutions and cultural transmission lead to changes in many more people’s habits of thought. Kahneman has provided a valuable service of consciousness-raising, but the design of remedies is an ongoing task.

Thomas Nagel teaches philosophy and law at New York University. His new book, Mind and Cosmos, will be published later this year by Oxford University Press. This piece appeared in the February 16, 2012 issue of the magazine.