In 1971, Daniel Kahneman and Amos Tversky, then psychology professors at the Hebrew University of Jerusalem, began a sabbatical year at the Oregon Research Institute. The two Israelis, both in their 30s, seemed like a study in contrasts: where Tversky, a decorated paratrooper with shrapnel lodged in his body, was optimistic and analytical, Kahneman was pessimistic and intuitive. But they shared a sense of humor, and an interest in the psychology of mistakes.
That year they ran dozens of experiments. In one, they built a wheel marked 0 to 100, but rigged it to stop on only 10 or 65. After each spin, the subject wrote down the number and was then asked to guess the percentage of countries in the United Nations that are African. On average, those who spun a 10 guessed 25 percent, while those who spun a 65 guessed 45 percent.
The number on the wheel, though arbitrary, unconsciously swayed people’s predictions—hence the phenomenon is known as anchoring. It happens everywhere. For instance, a sale on cans of tuna that limits each customer to 12 causes the average shopper to buy twice as many cans (seven) as he or she would if there were no limit. People also anchor on ideas, sometimes with serious consequences. Recent studies indicate that physicians can fixate on an initial but ultimately misleading symptom, jump to conclusions, and fail to make an accurate diagnosis.
Kahneman and Tversky became connoisseurs of such cognitive biases, meticulously cataloging the ways in which human thinking is flawed.
Beneath the laboratory curiosities lurked an explosive idea. In the 1970s—and still today, though to a lesser extent—two beliefs held sway in the social sciences. First, that people are generally rational and have sound judgment. Second, that when they depart from rationality, it’s a temporary aberration, resulting from emotions like fear, hatred, and love. Kahneman and Tversky’s research suggested an entirely different view: that it is the very way we think—our use of what they called heuristics, or mental shortcuts—that leads us astray.
In 1974 they published their findings in Science. “In general,” they wrote, “these heuristics are quite useful, but sometimes they lead to severe and systematic errors.” That might not sound like the opening shot of a revolution, but as Mark Kelman, a professor of law at Stanford University, puts it: “This was reconceptualize-the-world-type stuff.”
Five years later, Kahneman and Tversky did it again, this time upending conventional wisdom about economic behavior. Assumptions about rationality and selfish profit-seeking are built into utility theory, the dominant model in economics, which holds that people will always act in their own best interests. But for Kahneman and Tversky, it was self-evident that people are neither fully rational nor completely selfish. Their “Prospect Theory: An Analysis of Decision Under Risk,” published in Econometrica, exposed flaws in utility theory by pointing out how it fails to capture the way people actually behave: We are easily influenced by frames and anchors; we’re overconfident; we fear losses more than we value gains. Prospect theory, they argued in 29 equation-packed pages, provides a more psychologically realistic model of economic behavior. (The name itself, “prospect theory,” is meaningless. Kahneman and Tversky wanted something distinctive and easy to remember.)
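To make the loss-aversion claim concrete, here is a minimal sketch of the kind of value function prospect theory proposes: concave for gains, steeper for losses. The specific parameter values (alpha of about 0.88, lambda of about 2.25) are not in the 1979 paper; they are the median estimates Tversky and Kahneman reported in their 1992 follow-up, and the code is only an illustration, not their model in full.

```python
# Illustrative sketch of a prospect-theory-style value function.
# Parameters are the median estimates from Tversky and Kahneman's 1992
# follow-up paper; the 1979 Econometrica paper fixes the curve's shape,
# not these particular numbers.

def subjective_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x > 0) or loss (x < 0), in dollars."""
    if x >= 0:
        return x ** alpha            # gains: diminishing sensitivity
    return -lam * (-x) ** alpha      # losses: loom larger by the factor lam

print(round(subjective_value(100), 1))   # about 57.5
print(round(subjective_value(-100), 1))  # about -129.5
```

In this toy calculation, a $100 loss registers at more than twice the magnitude of a $100 gain, which is the asymmetry behind the claim that losses hurt more than gains please.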
“Going back to Adam Smith, everyone knew that the idea that people operate optimally is a simplification,” says Eric Wanner, president of the Russell Sage Foundation and an early enthusiast of Kahneman and Tversky’s work. “But until prospect theory, nobody had pinned down the psychology well enough to do anything about it.” Richard Thaler, a professor of economics at the University of Chicago, has an earthier explanation of prospect theory’s impact: “Rationality was f***ed.”
Perhaps, but it didn’t feel that way to most people at the time. Kahneman and Tversky (who died in 1996) had early converts among some junior professors and insurgent types, but their thinking was at first far from the mainstream. Today, however, their ideas have rippled across the scholarly landscape, from economics to engineering, medicine to environmental studies. Their Science and Econometrica papers are two of the most cited in all of social science. According to the Thomson Reuters Web of Science, Kahneman has appeared or been cited in scholarly journals 28,312 times since 1979. In 2002 he won the Nobel in economic science for “having integrated insights from psychological research into economic science.”
Kahneman’s career tells the story of how an idea can germinate, find far-flung disciples, and eventually reshape entire disciplines. Among scholars who do citation analysis, he is an anomaly. “When you look at how many areas of social science he’s put his fingers in, it’s just ridiculous,” says Jevin West, a postdoctoral researcher at the University of Washington, who has helped develop an algorithm for tracing the spread of ideas among disciplines. “Very rarely do you see someone with that amount of influence.”
But intellectual influence is tricky to define. Is it a matter of citations? Awards? Prestigious professorships? Book sales? A seat at Charlie Rose’s table? West suggests something else, something more compelling: “Kahneman’s career shows that intellectual influence is the ability to dissolve disciplinary boundaries.”
You don’t glean much about how he did that from his new book, Thinking, Fast and Slow (Farrar, Straus and Giroux). The book’s scope is wide—Kahneman, 77, revisits his entire body of scholarship, including the research on judgment and bias he did with Tversky, as well as his later work on happiness—but his focus is on the science, not himself. (Kahneman was unable to comment for this article, because of an arrangement with another publication.) For a clearer sense of his stature, turn to the blurbs. “Among the most influential psychologists in history,” says Steven Pinker. “One of the greatest psychologists and deepest thinkers of our time,” says Daniel Gilbert. Nassim Nicholas Taleb declares Thinking, Fast and Slow “a landmark book in social thought, in the same league as Adam Smith’s The Wealth of Nations.”
That’s not book-flacking hyperbole. (OK, maybe a little.) Ask around and you hear pretty much the same thing. “Kahneman is the most influential psychologist since Sigmund Freud,” says Christopher Chabris, a professor of psychology at Union College, in New York. “No one else has had such a broad impact on so many fields.”
Born in Tel Aviv in 1934, the son of Lithuanian Jews, Kahneman spent his boyhood in Paris, where the family prospered until Germany invaded, in 1940. Precocious and math-minded, the 6-year-old decided to sketch a graph of the family’s fortune: The curve dipped into negative territory.
Jews in France were placed under curfew and required to wear a Star of David. One evening, when Kahneman was no more than 7, he accidentally stayed late at a friend’s house. Before starting the few blocks’ walk home, he turned his sweater inside out. An SS soldier approached. “I was terrified that he would notice the star inside my sweater,” Kahneman recalled years later. Instead, the black-uniformed Nazi gave him a hug and showed him a photograph of his own son. The cognitive dissonance made a great impression on Kahneman: How was this soldier simultaneously capable of great cruelty and great affection?
After Kahneman’s father was arrested in a roundup of Jews—his employer, a chemical company, somehow negotiated his release—the family fled, first to the Riviera and then to the center of France. In 1944, Kahneman’s father died from untreated diabetes. The rest of the family survived the war and returned to Palestine. In an interview a few years ago, Kahneman was asked about his wartime experience. He said simply, “I was luckier than most of the children of my generation in that place in the world.”
At Hebrew University, Kahneman studied psychology and math, earning a bachelor’s degree in two years. In 1955, he joined the psychological-research unit of the Israeli military. Just 21, he found himself the best-trained psychologist in the young army. He was assigned to assess the psychological fitness and leadership abilities of new recruits. Mostly he watched as soldiers completed group challenges like trying to cross a six-foot-high wall using nothing but a log that couldn’t touch either the ground or the wall. Kahneman made note of who took charge and who was a quitter, and was confident in his evaluations.
That confidence was misplaced. Every few months, a commander would report to him about each soldier’s actual performance. It was always the same story: Kahneman’s evaluation had been about as accurate as a blind guess. He noticed something else as well: He was incapable of acknowledging the full extent of his own ignorance. He didn’t doubt the evidence, but he remained confident in his predictions.
Decades later, Kahneman coined a phrase for this cognitive fallacy—the illusion of validity—and applied it to the psychology of Wall Street. Fifty years of research is conclusive, he argues in Thinking, Fast and Slow: Picking stocks is a game of luck, not skill. And yet the illusion of expertise persists in the financial world—and not only there. We are all masters of self-deception, he suggests, blithely ignorant of our own ignorance.
By the mid-60s, Kahneman had joined the faculty at Hebrew University. One day Amos Tversky, a colleague, argued in a guest lecture in Kahneman’s class that people are generally good intuitive statisticians. Kahneman was skeptical, having already been sensitized to his own cognitive limitations. Their debate was lively, and they decided to collaborate on a study of intuition and expertise.
Their first paper was published in Psychological Bulletin in 1971, shortly before they arrived in Oregon. It confirmed what Kahneman had suspected: Even professional statisticians are poor intuitive statisticians. To determine the lead author, he and Tversky flipped a coin. Thereafter they alternated. Over the next 12 years, their research forever changed the way people think about thinking.
“When I met Danny and Amos, neither of them knew any economics,” says Richard Thaler. “They couldn’t have passed Econ 101.” Thaler, sharp-witted and talkative, is seated in his glass-walled corner office at the University of Chicago’s Booth School of Business. He props his feet on the cluttered desk, clasps his hands behind his head, and takes me back to 1976.
Thaler was then an untenured assistant professor at the University of Rochester with an unusual hobby: He collected examples of people behaving at odds with utility theory. For instance, he had a wine-collecting colleague who paid $35 for a bottle but refused to sell it for less than $100. Utility theory couldn’t explain the large disparity between those prices. Thaler called these cases anomalies and tacked a list of them to his office wall.
One day a package arrived from an acquaintance. Inside were several papers, including Kahneman and Tversky’s 1974 Science article on heuristics and biases. Thaler was enthralled. He tracked down an early draft of their essay on prospect theory—in which a key idea is that losses are more acutely felt than gains. Put another way: The pain of giving up a bottle of wine you own and value can be greater than the pleasure of getting an equally good bottle.
For Thaler, it was an aha! moment. “I was no longer the only crazy person in the world. There were at least two other equally crazy people,” he says, grinning broadly. “Even more, they were well regarded in their field, which I was not.” Kahneman and Tversky were then at the Center for Advanced Study in the Behavioral Sciences, at Stanford. In 1977, Thaler went to Palo Alto and stayed for 15 months. Behavioral economics had its origin story.
The once-marginal field is now booming. Consider that the top five economics journals rejected Thaler’s first paper on anomalous behavior (it was finally published in 1980 by the Journal of Economic Behavior and Organization). Today he is rumored to be on the shortlist for his own Nobel. What accounts for this sea change? How did an idea—integrating psychology into economics—become a movement?
Part of the answer can be traced to Eric Wanner. Back in the mid-1970s, he edited Harvard University Press’s series on cognitive science. Kahneman and Tversky were on the advisory board, and Wanner heard the buzz about prospect theory. In 1982 he left the press to join the Alfred P. Sloan Foundation, where he tried to bring economists and psychologists together to research the market implications of nonrational decision making. Kahneman and Tversky were at first skeptical, convinced that interdisciplinary work couldn’t be coerced. They suggested instead that Wanner get behind the few economists then willing to listen. Sloan’s first grant in that area, in 1983, paid for Thaler to spend a sabbatical year with Kahneman, who was then at the University of British Columbia. “That’s when behavioral economics really crystallized in my mind,” Thaler says.
A few years later, Wanner became president of the Russell Sage Foundation, which since 1986 has put $8.3-million into behavioral economics. “These are not princely sums,” Wanner says, but the money has been well spent. In 1994 the foundation established a biennial summer camp for budding behavioral economists. The two-week workshop for some 30 advanced graduate students and junior faculty was Kahneman’s idea. Among the graduates are several leading lights of the field, including David Laibson and Sendhil Mullainathan, of Harvard, and Terrance Odean, of the University of California at Berkeley. (Mullainathan, who received a MacArthur Foundation “genius award” in 2002, was recently appointed to lead the new Consumer Financial Protection Bureau’s Office of Research.) “Dollar for dollar,” says Colin Camerer, a professor of economics at the California Institute of Technology, “it’s the best social-science investment any foundation has ever made.”
As Kahneman and Tversky’s ideas hopped from discipline to discipline—by the early 1980s, prospect theory had spilled over into medicine, law, and political science—the pattern repeated itself: An enterprising, unorthodox scholar from outside of psychology would fall into their orbit and extend their ideas in new directions. The story of how this happened in medicine is representative.
Donald Redelmeier began his residency at the Stanford University Medical Center in the 80s and became a student of Tversky, who had joined the university’s faculty in 1978. “The brightest person I ever met,” Redelmeier says by phone from his office at the University of Toronto, where he is a physician and researcher. In a number of papers written with Kahneman and, separately, with Tversky, Redelmeier—called the “leading debunker of preconceived notions in the medical world” by The New York Times—explored doctor-and-patient decision making and the psychology of pain. He even put to rest the belief that arthritis symptoms are exacerbated by inclement weather. (Redelmeier and Tversky chalked that myth up to people’s tendency to look for patterns even where none exist.)
“Danny and Amos didn’t always see the medical connections, but they had a tremendous receptivity to people outside their domain of expertise,” says Redelmeier. “When they spoke about decision sciences, I was all ears; when I spoke about medicine, they shut up and listened.”
Framing—the way information is presented—is the most salient example of how a cognitive bias identified by Kahneman and Tversky can affect medical decision making. In a classic study done by Tversky and colleagues at Harvard Medical School, physicians were given two options to treat a patient with cancer: surgery or radiation. The five-year survival rate favored surgery, but the short-term risks were higher. Half the doctors in the study were told that the one-month survival rate was 90 percent, while the other half were told that there was a 10-percent mortality rate in the first month. The odds were the same, of course, but the doctors’ responses were markedly different. Those told the survival rate were much more likely to choose surgery (84 percent) than those who were given the mortality rate (50 percent).
Among the medical experts who have taken note of such findings is Jerome Groopman, an oncologist at Harvard Medical School and author, with Pamela Hartzband, of Your Medical Mind: How to Decide What Is Right for You (Penguin Press, 2011). “Rational-decision analysis is so far from a doctor’s reality,” he says, adding that the typical consultation lasts only eight to 10 minutes. In that time, doctors must rely on intuition and pattern recognition—this symptom suggests that ailment—to reach a diagnosis. About 80 percent of the time, he says, intuition gets it right. But in the other cases, the patient is misdiagnosed or the diagnosis is delayed.
Groopman believes that heuristics and biases are often to blame. “Intuition is powerful and necessary,” he says, “but if you just rely on that, you’re going to get it wrong.” According to Hartzband, that message is getting through to her students. “I routinely hear them using terms like anchoring,” she says, adding that Kahneman and Tversky “have definitely percolated through the ranks.”
Four decades after he and Tversky first cleared the way for a new understanding of the mind, Kahneman and his ideas have branched off in a dizzying array of directions. How to explain his influence? Most everyone agrees that his scholarship—especially the work with Tversky from 1971 to 1983—is just exceptionally good. Moreover, their insights are relatively easy to digest and pack a lot of explanatory power. And because they shine a light on the very stuff of thought, their ideas are relevant to just about everything.
Political scientists use prospect theory to model foreign-policy decision making. Some international-relations scholars argue that cognitive biases favor hawkish policies, making wars more likely to begin and more difficult to end. (Kahneman shares that view.) At Columbia University, an interdisciplinary group of economists, psychologists, and anthropologists is building on Kahneman’s ideas about risk perception to better understand apathy about climate change. Kahneman’s services are also, not surprisingly, in demand on Wall Street. Guggenheim Partners, a New York-based global financial-services firm that manages more than $125-billion in assets, has recently advertised a Kahneman-designed “proprietary approach” to help “high-net-worth investors understand their specific attitudes toward risk.”
It may be in the policy world that Kahneman’s ideas have gained the most attention recently, and where they may have the greatest impact. In the late 1990s, a movement in behavioral law and economics emerged to challenge the assumption, central to conventional law and economics, that judges, jurors, criminals, and consumers are rational. Conventional law and economics, which emerged in the 70s and is most closely associated with Richard Posner, is seen as a bulwark of free-market libertarianism. If people make good choices, the thinking goes, government need only get out of their way. Critics were at a disadvantage, says Thaler. They had misgivings and arguments, but no competing theory of economic behavior. “Then Kahneman and Tversky came along,” he says. “People who felt like they were being bullied now had something to hit back with.”
Much of this hitting back has been done by Kahneman’s friend and collaborator Cass R. Sunstein, the Harvard Law professor who now serves as head of the White House Office of Information and Regulatory Affairs. In 1998, Sunstein and Thaler, along with Christine Jolls, of Yale Law School, published a highly influential article—“A Behavioral Approach to Law and Economics”—in the Stanford Law Review. They called on legal scholars to adopt a more realistic view of human nature. In 2008, Sunstein and Thaler built on those ideas in Nudge: Improving Decisions About Health, Wealth, and Happiness (Yale University Press), which drew from Kahneman and Tversky to design noncoercive policies that encourage people to save more, eat better, and become smarter investors.
For example, 401(k) programs are generally opt-in, meaning that the onus to join is on the employee. Many of us want to, and doing so is certainly in our self-interest, but we’re human: We procrastinate, we forget. Sunstein and Thaler proposed switching 401(k) programs to automatic enrollment. Studies show that doing so increases employee participation. Moreover, because there is still an opt-out, people aren’t forced to join against their will. Kahneman calls Nudge the “bible of behavioral economics.” Interest in these ideas has spread across the Atlantic. The British government has established something called a Behavioural Insights Team to bring principles from behavioral economics to bear on public policy. (Thaler is an adviser.)
It now seems inevitable that Kahneman, who made his reputation by ignoring or defying conventional wisdom, is about to be anointed the intellectual guru of our economically irrational times. For proof, look no further than the newsstand. In the December issue of Vanity Fair, Michael Lewis profiles Kahneman, who is described on the cover as the “brilliant but quirky professor who made Moneyball possible.” Rumor has it that the article is a preview of Lewis’s sure-to-be best-selling next book. Will Aaron Sorkin write the movie script? Will Brad Pitt star? Will Kahneman fall victim to his own illusion of expertise?
That’s unlikely. Near the end of Thinking, Fast and Slow, he insists that his deep understanding of bias and blunder has not made him immune to either failing. “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—making excessively optimistic estimates of how long it will take to complete a project—“as it was before I made a study of these issues,” he writes.
But Kahneman, it seems, has indeed learned something about the limits of intuitive thinking. After all, what could be more counterintuitive than a humble guru?
Evan R. Goldstein is managing editor of The Chronicle Review.