“I’m very pessimistic about self-help,” said Nobel-winning behavioral psychologist Daniel Kahneman to a questioner at Seattle’s Town Hall last week. He’d been delivering a talk on “fast and slow thinking,” which contrasted the non-conscious pattern recognition of intuition with the more laborious process of reasoning.
The questioner wanted some tips on how to circumvent flawed intuitive leaps, but Kahneman’s position was that, if he himself is any model, knowing about intuition’s shortcomings doesn’t grant you any special ability to turn it on and off. (I would add that depending on how intently you approach Buddhist practice, your mileage may vary, if not in precluding the intuitive leap, then at least in reflecting before acting upon it. But this requires you to work through several meditation cushions.)
Well, what is the problem with intuition, anyway? In Thinking, Fast and Slow, Kahneman lays out the ways in which intuition’s pattern recognition is immensely capable. At the talk he referenced a few “miraculous” cases where a non-conscious expertise, gained from repeated experience, allowed quick thinking to save the day.
A medical professional, for instance, sees subtle signs of a heart attack coming before it hits and calls for an ambulance. A firefighter’s hot feet tell him to clear the room before he can explain why he’s alarmed.
“I hope I have made it clear,” Kahneman added later, “that there is nothing magical about this.” (Magic, he added, is the pernicious belief that the universe is ultimately knowable, can be “clearly seen.”) It’s just that we perceive, and associatively link, much more than we’re ever aware of. Over time, repeated linkages gain in relevance, to use search terminology, still without our being cognizant of it. But our ability to form hunches that pay off becomes stronger and stronger.
The best question you can ask of your intuition about something is whether the environment is regular enough to have taught you this lesson: “Quite often it just isn’t.”
The classic case is your stock market hunch. The market’s behavior is incredibly complex, defeating supercomputer attempts to find regular behaviors over its recorded history, let alone you, with your lifespan-limited range of experiences. (Now, of course, the trading market is in its way self-aware, knowing within milliseconds of moves large and small, and displaying exactly the kind of frightening reactivity that we are prone to as well.)
Part of the urge to hack our intuition comes from how we conceptualize it. To allow an audience to grasp the dynamic easily, Kahneman laid out System 1 (intuition) and System 2 (reasoning). It’s a “psychodrama with two fictitious characters.” System 1 is always “whispering” hints to System 2, and System 2, being lazy, prefers simply to endorse System 1’s input most of the time, and generate some bullshit explanation after the fact, if pressed.
When Kahneman surprised everyone with a slide that said simply “banana vomit,” he pointed out that, in addition to experiencing a measurable physiological reaction of disgust simply on seeing the word “vomit,” we also likely made up a story about bad bananas on the spot to explain the two words appearing next to each other. (Fun fact: When you make people appear to smile by asking them to place a pencil between their teeth, they find cartoons funnier.)
One problem is that creating some patterns excludes other patterns, in the same way that you can’t see the vase and face simultaneously. System 1 suppresses what doesn’t fit the pattern it’s settled on, and so you’re never aware of that missing, conflicting data. Interestingly, not only is intuition on autopilot, but so seems to be our response to it; we respond to people with gut feelings. We glare in exasperation as Spock weighs pros and cons.
Intuition isn’t about accuracy so much as it is about taking action. Sure, we work from terribly small sample sizes, but on the other hand, we have only to see the tiger jump out of the bushes that one time to jump into the next county the next time we hear rustling there. To promote action, rather than lengthy debate, gut-based hunches come with a healthy heaping of self-confidence and a sense of impenetrable coherence.
It’s a little like William James’s hypothesis that emotion is based on physiological response, rather than the reverse. The hair on the back of our neck goes up, and we describe a feeling of fear. Forming a judgment results in a feeling of confidence in it. We say we have “settled” on something, and in actual fact, we resist being moved off that position. Further details, especially if they don’t easily fit within our schema, are rejected so that we can continue to enjoy the way everything fits so neatly together.
When faced with a truly difficult problem, Kahneman said, intuition will try to answer an easier one instead. In the Israeli Army, he was tasked with testing candidates for officer training school, and had to report on whether candidates displayed leadership qualities in a stressful problem-solving situation. When the results were in, “our predictions were worthless,” he recounted. They were using one-time, situation-specific performance as a proxy for leadership skills, without having proven there was a strong correlation between the two.
(Equally, we are prone to making after-the-fact predictions. “I knew that was going to happen,” we will exclaim, when what we mean is that it was among the outcomes we had entertained. Thinking something will happen and seeing it happen is not a prediction; that’s chance. Even explaining why something has to happen, and watching that exact thing happen is a question of percentages, as Nassim Taleb will demonstrate for you.)
While intuition is flawed, there’s not much that can easily be done about it. Remember that the “systems” are fictitious–in reality the same brain’s function generates both. Trying to “correct” intuition’s real-time performance could affect brain function in a way that would change the performance of System 2. Besides, intuition is very good at getting us through our day without having to reason our way past every bit of news we encounter.
Kahneman suggested that we really only need to revisit the most significant decisions, and check for evidence of systematic bias. Because we are the biased system, in most cases, self-help isn’t help at all. You get more mileage from asking other people, whom you haven’t screened to share your biases, what they think of your decision.
“Let other people correct you,” advised Kahneman, knowing that is exactly what many people are least fond of. But as proof of the soundness of his advice, he pointed out how much better he is at detecting other people’s mistakes than his own.
For more on this subject, Kahneman suggests you read Steven Pinker’s How the Mind Works, watch Moneyball on the confrontation between scouts and statistics, and, of course, read Thinking, Fast and Slow.