Meet a man who devoted his life to sorting out HOW the brain makes decisions. It turns out the judgment process for making decisions in the face of any challenge can be divided into either ‘fast’ or ‘slow’ thinking. We explore thought experiments that help explain why you can often be so positive you are correct, only to find out you’re wrong.
- Daniel Kahneman, author, "Thinking, Fast and Slow" (Farrar, Straus and Giroux); also Eugene Higgins Professor of Psychology Emeritus, and Professor of Psychology and Public Affairs Emeritus, The Woodrow Wilson School, Princeton University
MR. KOJO NNAMDIFrom WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show." It's a marvel of efficient problem-solving. Every day, the human brain makes thousands of decisions, most of them snap judgments, deciding whether the car in front of us is going to swerve into our lane, whether a spouse's voice on the phone means we're in trouble for something. Some involve taking a step back and doing some math, calculating a tip or filling out a health care form.
MR. KOJO NNAMDIBut most of the time, our thinking is rational, except, of course, for when it isn't. Psychologist Daniel Kahneman has spent four decades studying what happens when our judgment fails us, why the brain arrives at irrational and illogical decisions. His work laid the foundation for the field of behavioral economics. It's been used to explain why investors make bad bets on stock, how arbitrary factors can affect sentences handed down by judges and why your kitchen remodeling job always costs much more than you expect.
MR. KOJO NNAMDIDaniel Kahneman joins us in studio. He's a professor of psychology emeritus at Princeton University and author of the book "Thinking, Fast and Slow." He received the 2002 Nobel Prize in Economic Science, the first non-economist to be awarded that prize. Daniel Kahneman, thank you so much for joining us.
PROF. DANIEL KAHNEMANThank you for having me.
NNAMDIThe human brain is an exceedingly complex system. Any given minute, it's responding to millions of data points and trying to make sense of our surroundings. You have called this book "Thinking, Fast and Slow" because you say the brain has two different systems for that process of making sense of things. Please explain.
KAHNEMANWell, there really are two types of thinking, and it's more appropriate, more accurate to speak of two types than of two systems. But two systems is a very efficient way of expressing it because we -- they develop personalities. System one does the automatic thinking, which is -- and that's, you know, what happens when I say two plus two. The number pops into your head. System two does the effortful thinking.
KAHNEMANSo, you know, when you fill out the health form or when I tell you to compute 17 times 24, that calls on system two. And computing it would demand work, and it would have a sense of agency. That would be something that you do, something that you're the author of, which system one doesn't have. Automatic thinking happens to you.
NNAMDIIn the vast majority of cases, we are rational beings, but your work examines what happens when we get things wrong. Why do we get things wrong?
KAHNEMANWell, you know, we get things wrong because we don't have enough information. Or, sometimes, we have the information, but we don't put it together in efficient ways. So one example is statistical information, that is, knowledge about categories: we have it quite often, but we don't apply it, even when it is relevant. So that's an example of misuse of information, or lack of use of information that we do have.
KAHNEMANAnd the mind is an engine for jumping to conclusions, so it jumps to conclusions on the basis of very poor information. And it has another feature. When it cannot answer a question, it tends to answer another one, a simpler one, so that we're never really stumped, even when we don't actually know the answer to the question we're trying to answer.
NNAMDIWhen our brain is presented with a complex problem, it begins to tackle the question in different ways, but sometimes, the answer that seems obvious is actually illogical. You have a thought experiment in the book involving a randomly selected guy named Steve.
KAHNEMANYes. Well, Steve is a meek and tidy soul with a passion for detail. And, you know, I could go on, but -- and, you know, that he's been randomly selected from the population. And, now, is he more likely to be a farmer or a librarian? And, of course, the first thing that comes to everybody's mind is that he looks pretty much like a librarian. That's how he was intended to look. But the fact is that there are 20 times as many male farmers as male librarians. So if you put the two together, he's much more likely to be a farmer.
NNAMDIAnd the way my brain worked when I saw that was exactly in the way you described: I thought of the rise of agribusiness and the disappearance of the farmers, so that it's conceivable that there are more male librarians than there are male farmers. And I guess that's the way we use our brains, isn't it?
KAHNEMANWell, you know, for most people, I mean, what you describe is quite sophisticated. For most people, what they will immediately respond to is that a meek, tidy soul, passion for detail, that sounds like a librarian...
KAHNEMAN...and they will not go much further. So they are asked the question about probability, but they answer a question about similarity.
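The base-rate logic behind the Steve example can be sketched with Bayes' rule. The 20-to-1 ratio of male farmers to male librarians comes from the interview; the two likelihoods below are purely illustrative assumptions:

```python
# Illustrative Bayes-rule sketch. The 20:1 base rate is from the interview;
# the two likelihood numbers are invented for the example.
p_librarian = 1 / 21               # prior: 1 male librarian per 20 male farmers
p_farmer = 20 / 21
p_desc_given_librarian = 0.90      # assumed: "meek and tidy" strongly fits
p_desc_given_farmer = 0.10         # assumed: description only weakly fits

# Posterior is proportional to prior times likelihood.
score_librarian = p_librarian * p_desc_given_librarian
score_farmer = p_farmer * p_desc_given_farmer
p_farmer_given_desc = score_farmer / (score_farmer + score_librarian)

print(round(p_farmer_given_desc, 2))  # 0.69 -- Steve is still probably a farmer
```

Even granting that the description fits a librarian nine times better than a farmer, the base rate dominates: similarity answers the wrong question.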
NNAMDIOne of your earliest and most influential works explored biases and something called heuristics. How do these factors influence our thinking?
KAHNEMANWell, what happens is -- as I said, what heuristics are is that you're asked one question, but you answer a simpler one instead. And, you know, I can give you examples of the way that works, but...
KAHNEMANWell, here is one. So I'll tell you about a young woman. Her name is Julie. She's graduating from a university. She's a graduating senior. And I'll tell you one fact about her, which is that she read fluently when she was age 4. And now, I ask you, what is her GPA? And the striking thing is that a number came to your mind, and it's less than four and definitely more than 3.2. So, you know, it's around 3.6, 3.7. Everybody thinks that...
NNAMDIIt was 3.8 in my case, actually.
KAHNEMANYeah. Well, it's in that neighborhood.
KAHNEMANAnd what you did -- and we know you did that. You -- it's quite complicated what you did. You evaluated how precocious she was in her reading, and that gave her, in your mind, the equivalent of a percentile score. How extreme is she? And then you match that percentile score, and you translated that into a GPA. And that happened in your associative memory immediately. You were barely conscious of doing it, and an answer popped into your mind.
KAHNEMANIt's a very sophisticated calculation. It's wrong. You -- that is not, you know, what a statistically correct answer would be, but that's the way associations work.
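The "matching" calculation Kahneman describes, and the statistically sounder regressive answer he contrasts it with in the book, can be sketched like this. All the numbers here (Julie's percentile, the GPA distribution, the 0.3 correlation) are illustrative assumptions:

```python
from statistics import NormalDist

# Assumed numbers: early reading puts Julie at roughly the 90th percentile,
# and GPAs are roughly normal with mean 3.1 and standard deviation 0.4.
percentile = 0.90
gpa_mean, gpa_sd = 3.1, 0.4
z = NormalDist().inv_cdf(percentile)

# Intuitive "matching" answer: the same percentile in the GPA distribution.
matched_gpa = gpa_mean + z * gpa_sd

# Regressive answer: shrink toward the mean, because reading age only weakly
# predicts college GPA (assumed correlation of 0.3).
correlation = 0.3
regressed_gpa = gpa_mean + correlation * z * gpa_sd

print(round(matched_gpa, 2), round(regressed_gpa, 2))  # 3.61 3.25
```

The matched answer lands right in the "around 3.6, 3.7" neighborhood listeners report; the regressed answer, much closer to the mean, is what a statistically correct prediction would look like.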
NNAMDIIn case you're just joining us, we're speaking with Daniel Kahneman. He's a professor of psychology emeritus at Princeton University and author of the book "Thinking, Fast and Slow." He received the 2002 Nobel Prize in Economic Science. He was the first non-economist to be awarded that prize. Human beings seem to have a kind of optimism baked into our brains that leads us to expect better outcomes than we will likely get.
NNAMDIOne of the biases you identify in your work that hits close to home is called the planning fallacy. How does that work when, oh, we're thinking about, oh, remodeling our kitchen or our bathroom?
KAHNEMANWell, you know, we think of remodeling our kitchen, so we make a plan. And we get an estimate, and we think that we are going to spend roughly what the estimate was. But, in fact, many things happen. Things go wrong, which are going to be more expensive, and then you think of other things that you didn't think of while you were making your plans. They turn out to be very expensive. And, on average, people spend twice as much as they had planned, twice as much as they had expected.
KAHNEMANSo the planning fallacy is that you expect outcomes to be fairly similar to your plans, and they're not. They tend to be much worse than your plan.
NNAMDIHappens to me every time I go out to buy a car, but that's another story. The planning fallacy leads us to undertake risky behavior without really considering whether it's worth it or whether it's in our interest, right?
NNAMDIHumans have a tendency to fear losses more than we value gains, and that can lead us to take risks. This is a different kind of bias called loss aversion. When risk is involved, we make decisions to mitigate our losses. You've designed a number of interesting studies to explore and measure how our fear of losses distorts our thinking. Can you give us an example?
KAHNEMANWell, yes. I mean, here's a very simple example. You can ask people -- we're going to play a gamble on the toss of a coin. And if, you know, it shows tails, you lose $100, if it shows heads, you win X dollars. And now, what does X have to be for the gamble to be attractive to you? Well, it turns out, for many people, X has to be more than $200, so you have a gain. You have a loss. They're equally likely. And people put a lot more weight, approximately twice as much weight, on the losses as they do on the gain. And loss aversion is a major facet of our lives.
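A minimal sketch of this coin-toss question, using the roughly 2-to-1 weighting of losses that Kahneman cites (the exact coefficient is an assumption for illustration):

```python
# Loss-averse value function in the spirit of prospect theory:
# losses loom about twice as large as equivalent gains.
LAMBDA = 2.0  # assumed loss-aversion coefficient (~2 is the commonly cited estimate)

def subjective_value(x):
    """Subjective value of a gain (x >= 0) or loss (x < 0)."""
    return x if x >= 0 else LAMBDA * x

def gamble_value(win_amount, loss_amount=100):
    # Fair coin: half chance of winning, half chance of losing $100.
    return 0.5 * subjective_value(win_amount) + 0.5 * subjective_value(-loss_amount)

print(gamble_value(150))  # -25.0: $150 still doesn't compensate a $100 loss
print(gamble_value(250))  # 25.0: above $200 the gamble becomes attractive
```

The break-even point sits at a $200 win against a $100 loss, matching the "X has to be more than $200" answer in the interview.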
NNAMDIYou and your late partner, Tversky, also designed a survey question involving the outbreak of an unusual Asian disease which is expected to kill 600 people. Tell us about that.
KAHNEMANWell, that was an experiment on what we call framing. So you have exactly the same problem, and you can describe it in two different ways. So, in that case, there was a...
NNAMDISome responders were asked about...
KAHNEMAN...a choice between saving 600 people for sure or a gamble in which you had, I think it -- was it a one-third probability...
KAHNEMAN...to save 200. No, no, saving 200 people for sure...
KAHNEMAN...or a one-third probability of saving all 600. An alternative way of framing exactly the same problem would be that, in one case, you had two-thirds probability of 600 people dying or 200 -- or 400 people dying for sure. Now, those problems are really identical, but people choose quite differently. They prefer to save the 200 lives, and they hate the idea of 400 people dying. So they take the gamble when the problem is presented as a loss. And that happens quite frequently. People make different decisions depending on how a problem is framed.
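The two frames can be checked arithmetically. Restated as expected survivors out of 600, all four options are identical, which is exactly why the preference reversal counts as a framing effect:

```python
# Expected survivors out of 600 under each option of the framing problem;
# both frames describe the same outcomes.

def expected_saved(outcomes):
    """outcomes: list of (probability, people_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

# Gain frame
program_a = expected_saved([(1.0, 200)])            # 200 saved for sure
program_b = expected_saved([(1/3, 600), (2/3, 0)])  # one-third chance all saved

# Loss frame, restated as survivors
program_c = expected_saved([(1.0, 600 - 400)])      # 400 die for sure
program_d = expected_saved([(1/3, 600), (2/3, 0)])  # two-thirds chance all 600 die

print(program_a, program_b, program_c, program_d)   # each is 200 expected survivors
```

People typically choose the sure option in the gain frame and the gamble in the loss frame, even though the expected outcomes never change.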
NNAMDIIndeed, people have used this concept to explain a whole array of real-world phenomena that seemed to defy logic. Why do investors, for instance, cling to stocks that are declining in value? Why are homeowners reluctant to sell their homes when the market tanks?
KAHNEMANWell, that loss aversion -- one of the manifestations of loss aversion is that you hate to admit a loss. You hate to -- as you would, when you sell a house for less than you paid for it, you know that you lose money. And people are very reluctant to do that. And so that's why the market for houses tends to dry up when prices are going down because you have many people insisting on getting what they think their house was worth, and it was worth at least as much as they paid for it.
NNAMDIOne of the more important early biases and distortions you wrote about is something called the anchoring effect. What is that?
KAHNEMANWell, the anchoring effect is that when you consider a possible solution to a problem, you make that solution plausible in your mind, and that has many consequences. So I'll give you an example. If I ask you, is the tallest redwood more or less than 1,200 feet tall? Now, you know that it's too much, but what I have made you think of is -- I have made you think of very tall trees. If I ask you, is the tallest redwood more or less tall than 100 feet?
KAHNEMANYou know that that's too little, but I have made you think of short trees. Now, if I ask you, what is your best guess? What is the height of the tallest redwood? You are not going to give me the same answer if I ask you the first question or the second. That's the anchoring effect. It's a very large effect.
NNAMDII found it fascinating that in one experiment, German judges were inclined to give a shoplifter a longer sentence if they just rolled a pair of dice loaded to give a high number. That, of course, can be found in the book "Thinking, Fast and Slow." Anchors seem to be very important in negotiations as well. We get back to my buying a car again. If I'm buying a car or negotiating in a market in a developing country, the first number, the first offer thrown out serves as the anchor, and it, therefore, can profoundly influence the terms of your negotiation, correct?
KAHNEMANAbsolutely right. And, you know, that's why real estate agents, you know, when you go to buy a house, they will frequently take you first to see a house that is much more expensive than you said you intended to buy. And they say, I know this is more than you're planning to buy. I'm just showing it to you to give you a sense of what money can get. And they get you anchored on that price as a buying price. And it has a very big effect on your willingness to pay.
NNAMDIThat is almost exactly what happened to my wife and me when we first went out to buy a house. And then when we eventually saw another house that was a much lower price and my wife called me to tell me about that, my first words were, what's wrong with it?
NNAMDIBecause the anchor price had already prejudiced me in that way. With that, we're going to take a short break. When we come back, we'll continue this conversation with Daniel Kahneman. I'm Kojo Nnamdi.
NNAMDIWelcome back to our conversation with Daniel Kahneman. He's a professor of psychology emeritus at Princeton University and author of the book "Thinking, Fast and Slow." Daniel Kahneman, I'd like to play you a clip from an unlikely public philosopher and cognitive psychologist. Please give a listen.
SECRETARY DONALD RUMSFELDThere are known knowns. There are things we know we know. We also know there are known unknowns. That is to say, we know there are some things we do not know. But there are also unknown unknowns, the ones we don't know we don't know.
NNAMDIThat, of course, is the famous quote from former Defense Secretary Donald Rumsfeld, explaining the challenges of preparing for threats to the United States. Some people saw this as an extremely tortured string of logic. But the idea of unknown unknowns really gets down to the core of what you're talking about, doesn't it?
KAHNEMANOh, it's a very good bit of cognitive psychology and one of the few things I like about Rumsfeld, but it's a first-rate statement of a very important problem.
NNAMDIYou write about something called the availability heuristic: sometimes, when we consider a question, our own memory ends up profoundly biasing how we answer that question. For example, here's a very Washington-relevant question: Is adultery more common among politicians than among lawyers or physicians? There are some very compelling arguments that would seem to buttress this idea. I can rattle off a list of politicians who have been caught in affairs, but I can't rattle off all of the local doctors who were.
KAHNEMANSo, you know, actually, I found myself holding that belief. And it's an absurd belief because, of course, there is an enormous bias. The media are very interested in the affairs of politicians, and, by and large, the affairs of physicians and lawyers are of very little interest. So you are going to get many more media reports of one kind of affair and far fewer of the others, and you draw instantaneously.
KAHNEMANYou draw that inference that there are more of one kind because, when asked a question, like how many affairs do politicians have, our mind goes to how easy it is to come up with examples. And the media makes it quite easy.
NNAMDISpeaking of the media, the availability heuristic seems particularly relevant for us, the media, when it comes to what topics and what news stories we cover. I guess it's one of the more profound but subtle biases that exist in journalism. We only cover issues that are on our radio or our radar screens, or, put another way, there are hundreds of news stories that we might cover if we knew about them.
KAHNEMANAbsolutely right. And, you know, everybody actually is influenced by that. The political influence of that is really quite remarkable because we also tend to judge the importance of issues by how frequently they are mentioned. So when an issue gets sort of repeatedly discussed, it becomes more and more important. And you can take -- politicians are very good at that. They can take almost any problem and make it the center of the universe simply by talking repeatedly about it.
NNAMDIAnd if you happen to be listening to any political debates during this political season, you'll get a wide variety of examples about that. Getting back to the topic of judges for a minute, this July, there was a very interesting report on immigration judges in asylum cases. A non-partisan group out of Syracuse University analyzed the decisions of 265 immigration judges across the country, who have ruled in at least 100 political asylum cases in the last five years.
NNAMDIOn average, judges rejected 53.2 percent of applications. But the report highlighted a few judges who denied requests at 80 and 90 percent levels. What disturbs many observers is just how stark the rates are and the concern that the merits of the individual cases are not the only or not even the primary determining factor in who gets asylum. What do you think?
KAHNEMANWell, you know, you would want to know, of course, whether the judges get samples from the same population of cases. I mean, it's also possible that they don't. You know, that's the first thing that I would warn listeners to worry about and to make sure of. And then, you know, we know that judges are susceptible to crazy influences. I mean, there is a story coming from Israel about parole judges in Israel and the rate at which they grant parole.
KAHNEMANAnd it's a function of how recently they have eaten. So, immediately after a lunch break, they grant parole about 65 percent of the time. Two hours later, when they're getting hungry, they grant parole to almost no one. And it's a very big, very reliable effect. And they're, of course, completely unaware of it.
NNAMDIDo you find yourself sometimes getting scared by the insights you are able to draw from the research that you've been doing?
KAHNEMANNo. You know, I've been doing this for a long time, and I have few illusions. I find it more amusing than scary. It gets very scary when you realize a mental principle, which is that high-stakes decisions are not necessarily made better than small-stakes decisions. I'm not sure that the decision to go to war is necessarily made better than the plan for a kitchen. And we know that planning a kitchen isn't done very well. And I worry about planning wars.
NNAMDIThank you. That's the scary part of it. You're writing about cognitive processes at an individual level. But what kind of relevance do you see, system-wide or economy-wide? Did any of these biases, for instance, contribute to the Great Recession?
KAHNEMANWell, we know of one bias that certainly contributed to the Great Recession, and that's on the side of the people who bought the mortgages, who bought mortgages that they couldn't afford. And that we know. I mean, this is optimistic overconfidence. It's a very well-known bias. And it was exploited by predatory lenders. Then there were people who were optimistic and overconfident in the banking industry, but their behavior is more complex because they were also rewarded and paid for being overconfident.
KAHNEMANSo that's different. But, certainly, at the bottom of the pile, the people who bought mortgages that they couldn't afford were clearly biased. And they needed protection, and they didn't have it.
NNAMDIBut I'm glad you also mentioned the people on Wall Street, the people who are supposed to be experts because maybe the people who bought mortgages can be -- I don't know -- forgiven for their own overconfidence. But you say that many experts overestimate their skills also.
KAHNEMANWell, you know, I think the question is how you define experts. And so if anybody who simply has been doing something for a long time thinks that he or she is an expert, then many of them are wrong. You develop expertise if you live in a world that is sufficiently regular to have rules that you can discover, and then, when you discover them, dealing with those rules becomes automatic. That's how you recognize, you know, that the driver in the other lane is likely to be dangerous. You've been driving a long time.
KAHNEMANBut if you live in a world that is largely random, you cannot develop expertise. And so people who pick stocks for a living, they live in a world that is largely random, and they feel that they have expertise. I mean, they're quite sincere in their belief that they can do it, but their belief in certainly the very large majority of cases is wrong.
NNAMDIIt's really luck and chance in a lot of cases.
KAHNEMANIt is luck and chance. And we know the mechanism where people who are actually successful because of luck will feel that they're successful because of their skill. And other people will believe it, too.
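This luck-versus-skill point is easy to demonstrate by simulation. In the sketch below (the picker count and horizon are arbitrary assumptions), every "stock picker" is a pure coin flipper, yet dozens still compile perfect five-year records:

```python
import random

# Simulate stock pickers whose yearly result is a 50/50 coin flip, and count
# how many "beat the market" five years in a row by chance alone.
random.seed(0)  # fixed seed so the run is reproducible

def lucky_streaks(n_pickers=1000, n_years=5):
    streaks = 0
    for _ in range(n_pickers):
        if all(random.random() < 0.5 for _ in range(n_years)):
            streaks += 1
    return streaks

# Expect about 1000 * (1/2)**5, i.e. roughly 31 pickers with a perfect
# five-year record, none of whom has any skill at all.
print(lucky_streaks())
```

Those three-dozen-odd "winners" would look like proven talent to themselves and to clients, which is exactly the mechanism Kahneman describes.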
NNAMDIWe're talking with Daniel Kahneman. He is a professor of psychology emeritus at Princeton University. We're talking about his latest book "Thinking, Fast and Slow." Daniel Kahneman is the winner of the Nobel Prize in economics in 2002. Some people call it thinking by your gut. Some call it intuition, circumstances when we just feel like we know the answer to a question, whether we can explain it or not. What is intuition?
KAHNEMANWell, intuition is really defined as knowing without knowing why you know. And that describes most of our thinking. So, you know, if a little child points to something and says, doggie, that little child knows without knowing why it knows. It knows it has recognized an object. In the words of the great psychologist and general intellectual Herbert Simon, intuition is recognition. But we have intuitions that are baseless.
KAHNEMANSo when we answer the wrong problem, as I was describing earlier, when we use heuristics of judgment, when we're anchored, we also have intuitions. We have the same confidence in both as we do when we have expertise, but they happen not to be necessarily correct.
NNAMDIFor most things in our lives, we are actually pretty good at picking up subtle clues and sensing answers for reasons we cannot quite explain. What's going on in my brain when I feel like I know that, well, my wife is angry at me or when I'm sure that, as we pointed out, someone in the driving lane next to me is about to do something reckless? I'm using system one there, are I not -- am I not?
KAHNEMANYes. This would be called a system one operation. It's fast thinking, and it is a result of considerable experience. So, you know, this is what happens to chess players when they can recognize a complex situation and immediately know, you know, what's the outcome, white mates in three or whatever. And it's the same thing that enables you to recognize your wife's mood on the telephone.
NNAMDIAt the same time, many of our most irrational decisions come when we start recklessly using our gut, and, apparently, experts are just as prone to mistakes of cockiness and flawed logic as we are.
KAHNEMANWell, that's certainly -- you know, in some domains of life, people are selected for optimistic overconfidence. Most of the people who have a lot of influence on our lives are self-selected. You know, they are the leaders of organizations. They are the political leaders, and they are there because they can -- they think they can do things. And they sometimes think they can do things although they cannot. And so they're much more prone to optimistic overconfidence than most of us.
NNAMDIYou say that intuition does exist, but it's not quite as much of a magical process or a sixth sense as we tend to think it is?
KAHNEMANI think it's not magical at all. There is no magic about it. There is a lot of practice in the regular world. And with practice in the regular world, you develop good intuitions. And when you live in an irregular world, you have intuitions, and they're not very good. And you can't tell the difference. The tragedy about intuition is that a person cannot really tell the difference between good intuitions and bad ones.
NNAMDIHow do you tell the difference between a regular world and an irregular world?
KAHNEMANWell, you have to ask whether the world has -- you know, chess is the prime example of a regular world. So things happen, and you can predict what will happen next. The stock market -- prices in the stock market are a very good example of a completely irregular world because any information that is available is supposedly embedded in the price.
KAHNEMANAnd so anybody who thinks that he can know something that the market doesn't know is likely to be wrong. And much of life is between these two extremes, but we develop true skills closer to the regular end.
NNAMDIYou're the first non-economist to win a Nobel Prize in economics, but your work on cognition, biases and irrationality has been used in a variety of social sciences and medical applications. You and your academic partner of many years, Amos Tversky, laid this foundation for understanding our biases and blindness at a cognitive level. But unlike many academics in psychology and other research fields, you designed your inquiries around thought experiments and playful questions. How did you settle on this style of inquiry?
KAHNEMANWell, it turned out that it was natural for us because we tended to do research on each other. You know, we're very good friends. We spent many hours a day together. In our research, we spent a lot of time looking for problems where we knew the answer but where our intuition was incorrect. And if the two of us agreed on our intuition, we were pretty confident that other people would share the same intuition and that we could generate a question that would fool many people.
KAHNEMANSo that was the style of our research. It turns out, quite by accident, that this is a very effective style of communication because then you have readers who read those examples, and they know that what you say is true of them. They cannot say, well, it's true of, you know, undergraduates or something. It's true of me. I would fall for this. And it turns out that that was, I think, the primary reason that our work was relatively influential.
NNAMDIIt's certainly how it hit a responsive chord with me. We're going to have to take a short break. When we come back, we'll continue our conversation with Daniel Kahneman. His latest book is called "Thinking, Fast and Slow." I'm Kojo Nnamdi.
NNAMDIWe're talking with Daniel Kahneman. He is the 2002 Nobel Prize winner in economic science, the first non-economist to be awarded the prize. Daniel Kahneman is a professor of psychology emeritus at Princeton University and author of "Thinking, Fast and Slow." I'd like to talk a little bit about how these different processes end up distorting our memories and our subsequent decisions and perceptions about life experiences.
NNAMDIWhenever I'm presented with a problem or complex question, my brain draws upon a variety of past lived experiences. You've been particularly interested in how we form and how we retain those memories and whether our recollections of the past end up actually leading us to make irrational decisions.
KAHNEMANWell, it turns out that when people have an experience that is pleasant or unpleasant -- so, you know, it could be a location or it could be a medical procedure -- and then you ask them for an evaluation of what happened during that period, during that interval, we now understand fairly well how they form that overall impression of a past episode. And it's really peculiar because they're very influenced by the peak emotional moments. So in a medical procedure, it would be the most painful moment. And they're very influenced by how they felt at the very end of the experience.
KAHNEMANWe call that the peak-and-end rule, and then there is one variable to which our memories are almost completely insensitive, and that's the duration of the episode. So a short procedure and a long procedure, both painful, if they have the same peak and the same end, will elicit, essentially, the same global evaluation. Although anyone who looks at it says one of them is much worse than the other, because we want painful experiences to be short. That's not how memory works.
NNAMDIWell, your experiments on painful experiences have taken you to one of the least dignified experiences for people of a certain age, having had that experience myself: the colonoscopy. Please explain.
KAHNEMANWell, indeed. Colonoscopy used to be a painful procedure. Now it's not because it's administered with anesthesia. But when we ran experiments on it, it was a very painful procedure. And Donald Redelmeier, who actually ran that study at a hospital in Toronto, he had patients report every minute on the level of pain they were experiencing at that moment. And then, at the very end, he asked them various questions requiring a general evaluation of what they had just experienced. And the colonoscopies vary in length enormously.
KAHNEMANI mean, he had one that was four minutes long, and he had one that was nearly an hour. But duration had nothing to do with the global evaluations. They were really determined mostly by how bad it was at the peak and how bad it was at the end. And that has an obvious corollary: if you make the colonoscopy longer, but you add to it a bit that's not very painful, then you are going to improve the memory of it while making the experience itself actually worse. That is, by adding to the discomfort of people, you will improve their memory. And, indeed, that is the case.
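A toy version of the peak-end rule Kahneman describes. Averaging the peak and the final moment is a simplification consistent with the Redelmeier study's finding, not its exact model, and the pain ratings below are invented:

```python
# Peak-end sketch: remembered pain is roughly the average of the worst moment
# and the final moment, regardless of how long the episode lasted.

def remembered_pain(pain_per_minute):
    """Per-minute pain ratings (0-10); returns the peak-end evaluation."""
    return (max(pain_per_minute) + pain_per_minute[-1]) / 2

short_procedure = [4, 7, 8]            # ends at its most painful point
long_procedure = [4, 7, 8, 5, 3, 2]    # same peak, but tapers off mildly

print(remembered_pain(short_procedure))           # 8.0
print(remembered_pain(long_procedure))            # 5.0 -- remembered as milder
print(sum(short_procedure), sum(long_procedure))  # total pain: 19 vs 29
```

The longer procedure delivers more total pain (29 versus 19 rating-minutes) yet is remembered as the better experience, because it ends gently.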
NNAMDIConsider another scenario. I'm a basketball fan. I decide to pay a lot of money to buy floor seats at a Washington Wizards basketball game for my son and me. The seats are amazing. We have incredible views of the entire arena, and it's all going well until the last two minutes of the fourth quarter at the end of the game, when the Washington Wizards cough up a big lead and lose spectacularly.
NNAMDIWhen I'm driving home, am I thinking about the fun that my son and I had for, oh, 46 minutes of that 48-minute game? Or am I thinking about the fact that I wasted money on those seats just to see an awful basketball game?
KAHNEMANI think the latter. And we have a fair amount of evidence that people really evaluate experiences that way -- you know, if your memory was ruined, you think your experience was ruined. But, in fact, you had had the experience, and it was very pleasant. You know, memories are all we get to keep from our experiences. And so if you ruin the memory, you have ruined all you get to keep.
NNAMDIBecause I like the hotdogs. I like the peanuts. I loved how the game was going up until that point. But you are absolutely right. After the end of such games, I said, I'm never buying another ticket to one of these games again. I can't take it anymore. What implications does your research have, this kind of research, for better public policy and better living for individuals?
KAHNEMANWell, you know, for policy, there is increasing interest in basing policy on people's well-being, the idea that one of the objectives of policy should be to improve well-being or to reduce suffering. And it's a very significant question: how are you going to measure and define the suffering or the happiness that you want to influence? And you can define it by satisfaction, by asking people to remember experiences. Or you can define it by studying how they feel during the experience.
KAHNEMANHow you felt during the game versus how you felt at the end of the game -- those get you very different answers about what policy should be. So we have to deal with that question.
NNAMDII'm glad you mentioned the issue of happiness because, in looking at one of the reviews of your book by Jim Holt in The New York Times, one of the things he says is that Daniel Kahneman never grapples philosophically with the nature of rationality. He does, however, supply a fascinating account of what might be taken to be its goal: happiness. Is that the goal of rationality?
KAHNEMANWell, I didn't think of it that way.
NNAMDIThis is Jim Holt's study.
KAHNEMANThis is Jim Holt's way of, you know, transitioning to a new chapter in his review. But certainly, people are interested in having a good life, and how you define the good life is an urgent question. And the research that we have done makes that question more complicated. By allowing you to define how you enjoy a game in two very different ways, it shows you that there is a dilemma right at the heart of this problem of defining happiness.
NNAMDIGetting back to issues of public policy and better living for individuals, some of the cognitive mistakes we make as consumers don't come about by accident. Whenever I get a long user agreement for a software or for song downloads or whenever I get a long explanation on what's going on in my mutual fund, I'm always tempted to just kind of glance over and lock into system one. But I get a sense that these contracts are designed specifically to get me to do that, to get me not to ask tough questions.
KAHNEMANWell, they are. I mean, there's no question about it. I think one of the truly perverse consequences of the assumption of rationality that was widely held in economics, and is still widely held, is that consumers do not need protection beyond guaranteeing that the truth is made available to them -- so that, you know, it is essential, and it is illegal, to lie to people, and sometimes telling them the truth is legally enforced. But a rational individual is assumed to be able to read the small print.
KAHNEMANBut humans actually don't read the small print. And, of course, the firms who designed those contracts know that the small print will not be read, and there is an irresistible temptation to take advantage of it. And quite a few of them do take advantage of it.
NNAMDIAt a public policy level, we've seen a variety of interesting examples of attempts to simplify. People want to simplify the tax code, to force health care plans to describe benefits in plain English. Is that an effective way to avoid society-wide irrationality?
KAHNEMANWell, this is a, you know, very grand statement. I think it's an important way to improve the way things go in society. And that, I would say, is one of the more important overall consequences of the work that behavioral economists and people like me have been doing for the last 20 or 30 years: there are things that can be improved in society. They're not necessarily the big things, but there are small things that can be improved. And that will have significant consequences.
NNAMDILet's talk a little bit about why you wrote this book. A number of very successful books have been written for popular audiences taking insights from your research and purporting to explain the world or to help people improve their lives. I'd be interested to know what you think of self-help books.
KAHNEMANWell, I am a pessimist about self-help. I don't think that telling people about mistakes will, by itself, suffice to help them avoid them. And, you know, the reason I'm so pessimistic is, of course, that I've been studying this topic for 45 years, and my intuitions are no better than they were. So I know that that doesn't work. My objective was, in the first place, to describe what's going on.
KAHNEMANBut I also think that educating gossip is a worthwhile objective, and that we are pretty good at detecting the mistakes of other people much better than at detecting our own. And by providing a richer and subtler vocabulary that allows people to discuss the mistakes of others more intelligently, we might, ultimately, improve everybody's decision-making because everybody always anticipates the gossip of other people. And anticipating intelligent gossip is -- leads you to better decisions than anticipating stupid gossip.
NNAMDIA lot of these snap judgments that we make are baked into our brain's wiring and have evolutionary explanations, correct? Our species has been able to survive because we can detect dangers based on subtle cues that we may not be fully conscious of. What is the significance of your findings from an evolutionary perspective?
KAHNEMANWell, you know, there is a debate. There is a school among people who deal with evolution that assumes that just about anything you observe is optimal and, you know, has been designed and honed by evolution until it is perfect. And for that school, for people who have that belief, whenever you find anything odd in people's apparatus, they ask, you know, what makes this good? What makes this useful?
KAHNEMANI don't share that belief. I think, you know, much of it -- you have machinery that was designed by evolution to do one thing, and then there are side effects of it solving one problem. And the side effects may make it more difficult to solve another less important problem, and I think this is the way that our mind evolved.
NNAMDIWe're talking with Daniel Kahneman about his latest book. It's called "Thinking, Fast and Slow." Daniel Kahneman received the 2002 Nobel Prize in economic science. He was the first non-economist to be awarded that prize. When you first began developing these ideas, most researchers made a couple of base assumptions about human beings and about our decision-making. It was generally believed that we were rational beings, that we made decisions based on logic and basic math skills and that we only acted irrationally when we acted out of emotions like anger or humiliation.
NNAMDIBut you began to question whether our brains were actually as rational as we thought. In economics, people often refer to this mythical rational species as Homo economicus. You've actually never taken a formal economics class, but you're credited with disproving that Homo economicus ever lived.
KAHNEMANWell, you know, we didn't start with Homo economicus in mind. We actually started from visual illusions. So our visual system is nearly perfect. I mean, we're really very well-designed, and, you know, it has a long evolutionary history. But we are still susceptible to illusions. And when you analyze why we're susceptible to illusions, you find that the same process that generates accurate vision most of the time also generates illusions.
KAHNEMANAnd that was our approach. We were looking for the mechanisms that generate cognitive illusions and that also, most of the time, generate correct thinking. Homo economicus and the idea of rationality is a, you know, very extreme view that comes from decision theory and assumes that all of people's beliefs and preferences are internally consistent, which is completely unrealistic. And it's true that our subsequent research focused on disproving that.
NNAMDIWhen you talk about your subsequent research and the point at which you started out from, were there, along the way, surprises in the research that you were conducting that made you reflect about what you thought originally about so many of these things?
KAHNEMANWell, you know, it was really a series of surprises because we were exploring judgment first, and we really didn't anticipate what we were going to find. It was a very pleasant process of exploring what was going on in our own minds. And then, when we switched to studying decision-making, we went through the same process. We would make up questions, see how we answered them, and then look for patterns in our own answers, and most of it was surprising.
NNAMDISpeaking of patterns, what are the links between the processes going on in our brain and our physical movements and twitches?
KAHNEMANWell, the links are very close, and that's been studied extensively in recent years. So, you know, any change, any thinking that you do, any effort that you make changes your heart rate, changes your pupil size. You know, your pupils dilate whenever you think hard. And, in addition, when you are shown a word that you don't like, you recoil imperceptibly. You're not aware of doing it, but you can measure it.
KAHNEMANAnd then there are very tight links between our emotions and our facial expressions. And when you manipulate people's faces, it changes their emotions to match, you know, the emotion...
NNAMDISo when I frown, my emotion changes to match.
KAHNEMANWhen you frown, you become more serious. When you frown, system two is activated. You tend to pay more attention. You tend to lose touch with your intuitions. Many things happen to you when you frown.
NNAMDIBecause I've noticed that if I happen to be speaking before a class of students and I see a student frowning, it sends a message to me that the student is focusing, concentrating and analyzing what I'm saying more closely. You just explained it. We know that Americans are profoundly unhappy with their government and their elected leaders right now, especially their inability to come to agreement about cutting spending and raising taxes.
NNAMDII bring this up because it occurs to me that almost all of the claims being made are really emotional appeals. The numbers themselves are so mind-bogglingly huge that we can't really use the system two side of the brain because it's so difficult to grasp the difference between $1 billion here and $1 trillion there.
KAHNEMANWell, certainly. I mean, you know, politics is mostly system one. It is -- you know, it's an appeal to emotions. And we actually know that voting is an emotional act. And we trust certain leaders, and we distrust others. The world is just too complicated for citizens to understand it, and, ultimately, they have to trust someone or another. Or, in some cases, they don't find anybody they can trust, and then they are very unhappy with the system.
NNAMDISo we don't necessarily wrap our heads around those billions and trillions. We operate on the basis of trust. Daniel Kahneman is a professor of psychology emeritus at Princeton University and author of "Thinking, Fast and Slow." He received the 2002 Nobel Prize in economic science. He's the first non-economist to be awarded that prize. But before I go, I've got to share this great quote with the members of our audience.
NNAMDI"You need to have studied economics for many years before you'd be surprised by my research. It didn't shock my mother at all." There's so much here in "Thinking, Fast and Slow." I only wish we had another hour to talk about it, but I'm afraid we're out of time. Thank you so much for joining us.
KAHNEMANThank you for having me.
NNAMDIAnd thank you all for listening. "The Kojo Nnamdi Show" is produced by Brendan Sweeney, Michael Martinez, Ingalisa Schrobsdorff and Tayla Burney, with help from Kathy Goldgeier and Elizabeth Weinstein. Diane Vogel is the managing producer. Thank you all for listening. I'm Kojo Nnamdi.