March 20, 2006
Volume 58, Number 23
Westen: Partisan brains can keep politics, facts separate
By Beverly Clark
When it comes to forming opinions and making judgments on hot-button political issues, members of both political parties don't let facts get in the way of their decision-making, according to a recent Emory study. The research may shed light on why staunch Democrats and Republicans can hear the same information yet walk away with opposite conclusions.
Investigators used functional magnetic resonance imaging (fMRI) to study a sample of committed Democrats and Republicans during the three months prior to the 2004 presidential election. Participants from both parties were given a reasoning task, in which they were asked to evaluate threatening information about their own candidate. During the task, the subjects underwent fMRI to see what parts of their brain were active. What the researchers found was striking.
“We did not see any increased activation of the parts of the brain normally engaged during reasoning,” said Drew Westen, director of clinical psychology, who led the study. “What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion and circuits known to be involved in resolving conflicts.”
Once partisans had come to their biased conclusions, essentially finding ways to ignore information that could not be rationally discounted, not only did the circuits that mediate negative emotions such as sadness and disgust quiet down, but new activity appeared in circuits involved in reward, similar to what addicts experience when they get a fix, Westen explained.
“None of the circuits involved in conscious reasoning were particularly engaged,” he said. “Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones.”
During the study, the partisans were given 18 sets of stimuli, six each regarding President George W. Bush, Sen. John Kerry (Bush’s opponent in 2004) and politically neutral male control figures (such as actor Tom Hanks). For each set of stimuli, partisans first read a statement from the target (Bush or Kerry). The first statement was followed by a second statement that documented a clear contradiction between the target’s words and deeds, generally suggesting that the candidate was dishonest or pandering.
Next, partisans were asked to consider the discrepancy, and then to rate the extent to which the person’s words and deeds were contradictory. Finally, they were presented with an exculpatory statement that might explain away the apparent contradiction, and asked to reconsider and again rate the extent to which the target’s words and deeds were contradictory.
Behavioral data showed a pattern of emotionally biased reasoning: partisans denied obvious contradictions for their own candidate that they had no difficulty detecting in the opposing candidate. Importantly, in both their behavioral and neural responses, Republicans and Democrats did not differ in how they responded to contradictions for neutral control targets such as Hanks, but Democrats responded to Kerry as Republicans responded to Bush.
While reasoning about apparent contradictions for their own candidate, partisans showed activations throughout the orbital frontal cortex, indicating emotional processing and presumably emotion regulation strategies. There also were activations in areas of the brain associated with the experience of unpleasant emotions, the processing of emotion and conflict, and judgments of forgiveness and moral accountability.
Notably absent were any increases in activation of the dorsolateral prefrontal cortex, the part of the brain most associated with reasoning (as well as conscious efforts to suppress emotion). The finding suggests that the emotion-driven processes that lead to biased judgments likely occur outside of awareness, and are distinct from normal reasoning processes when emotion is not so heavily engaged, Westen said.
The investigators hypothesize that emotionally biased reasoning leads to the “stamping in” or reinforcement of a defensive belief, associating the participant’s revisionist account of the data with positive emotion or relief and elimination of distress. “The result is that partisan beliefs are calcified, and the person can learn very little from new data,” Westen said.
The study has potentially wide implications, from politics to business, and demonstrates that emotional bias can play a strong role in decision-making, Westen said. “Everyone from executives and judges to scientists and politicians may ‘reason’ to emotionally biased judgments when they have a vested interest in how to interpret the facts,” he said.
Coauthors of the study include Westen’s colleagues in psychology Pavel Blagov and Stephan Hamann, as well as Keith Harenski and Clint Kilts of psychiatry and behavioral sciences. The authors presented their findings in January at the annual conference of the Society for Personality and Social Psychology.