Cultural Cognition: Interview with Dan Kahan

According to cultural cognition theory, we are all biased, even when it comes to facts. The tendency of people to conform their beliefs about facts to values that define their cultural identities explains how exposure to sound climate science can cause people with opposing values to become even more polarized on the issue. 

► See our Key Takeaways from the interview

► Check out a collection of cultural cognition studies from Dan Kahan

Dan Kahan is the Elizabeth K. Dollard Professor of Law and Professor of Psychology at Yale Law School, where he is a member of the Cultural Cognition Project. To explain how worldviews come into play as people assess risk and interpret facts, the project measures people's preferences for how society should be organized and places them along two dimensions: hierarchy-egalitarianism and individualism-communitarianism. Kahan's work on how cultural cognition causes people to interpret information in a way that reinforces their predispositions has significant implications for climate communicators; it explains why the scientific consensus on climate change doesn't resonate with large segments of the American public. Here are excerpts from my conversation with him earlier this week.

Q: How do people respond when you explain cultural cognition to them and the role it plays in the climate debate?

A: People tend to be surprised. People are normally asking: why are the people who disagree with me so confused, why aren't they seeing the facts, and how is it possible for someone to be misleading them? The nature of our work makes it possible to see that people who disagree with others on climate change are really forming their opinions in pretty much the same way as anyone else is—the same kinds of influences are operating on both sides. The idea that somehow there's some kind of external force that's misinforming people, or that the other side won't listen to facts or is dumb—that's the kind of conclusion that we tend to challenge.

There's a famous study from the 1950s called 'They Saw a Game.' The researchers asked students at two Ivy League colleges to watch a film of a football game between their schools and to judge whether the referee had made mistakes on certain controversial calls. The students from one school, Princeton, said he made all these mistakes when he called penalties on Princeton, and the students from Dartmouth said he made all these mistakes when he called penalties on Dartmouth. That's motivated reasoning: people are fitting their perceptions of what they see in the film to the stake they have in affirming their membership in that group, and that's something I think we all know happens….What we don't always know is when it's happening. Even if you know about the phenomenon, it can still be a surprise to find it happening to you at a particular time. When we present our research showing that there is this tendency of people to fit their perceptions about risk and other kinds of policy-consequential facts to these group commitments, people are open to that; it's when we talk about particular issues that they're likely to be surprised that we have evidence that that's what's going on—that's what's going on with climate change, or disposal of nuclear waste, or with the effect of permitting people to carry concealed weapons in public.

Q: We've posted a collection of your research, and the latest study is about the impact of a geoengineering narrative on the climate debate. What did you discover?

A: The study was a test of a hypothesis about how you might be able to present information so that it's less likely to trigger these kinds of dynamics, less likely to trigger the motivated cognition. What's triggering it is that people have this kind of stake in membership in the group. Obviously you don't want to believe the people you trust are stupid, and you don't want to feel that you're going to form a belief that could estrange you from other people, so there's a kind of defensive pushback when people are presented with information that seems to challenge these predominant group beliefs.

With climate change, the source of the threat is typically that people assume the response is to put more restrictions on commerce, such as emission controls. That threatens the values and commitments of people whose cultural outlooks prize markets, individual initiative and ingenuity, so they're more likely to be defensive about it.

When we presented information about the risks of climate change, we asked our subjects to look at a study suggesting that carbon dissipation in the atmosphere is slower than scientists had thought, and to say whether they thought it was valid. We saw lots of cultural polarization when we had first told our subjects to think about emission controls as the solution to climate change. In another condition we first gave our subjects information about geoengineering; our hypothesis was that it wouldn't be as threatening to the groups that otherwise push back against information about climate change. It shows that solutions aren't just about restricting the kinds of activities they think are important, but actually about employing some of them, because these people tend to be very pro-technology.

The result of the study was that the individuals who had first been shown the information about geoengineering weren't as dismissive toward the evidence that climate change is actually happening. So the idea is: if you know that people are reacting to information based on whether they think it's consistent with their team's position, can you change their understanding of the relationship between that information and their team? There's no reason why anybody has to think that climate change is about whose team is winning; we're all going to lose if we don't have a way to think about the information in an open-minded way.

Q: With this example, even if geoengineering is a way in for some of these groups, is there a concern about communicating the idea that humans can fix every problem?

A: What exactly the impact of exposure to information about geoengineering will be on people's perceptions of the consequences of dealing with climate change risks now, and whether it will in fact cause them to reduce their sense that there is really anything to worry about ('we can always fix things'), is an empirical issue too, and it's one of the things we were interested in testing with this study. We found that the group exposed to the information on geoengineering wasn't any less concerned about climate change, so we didn't find evidence that it was diminishing the motivation people had to consider the need for solutions.

The most important message of the kind of work we do is that science communication can be studied scientifically: lots of hypotheses can be formed, but they can also be tested, and it makes sense to try to get it sorted out. If in fact there is some benefit to presenting information in a certain way, then we should present it that way.

Q: In Fixing the Communications Failure you write that "one possible way to control cultural cognition is to present information in a manner that affirms rather than threatens people's values." What are some other examples?

A: We've done a similar study with nuclear power: when people are made aware that nuclear power is also something being considered among the array of options, we see less cultural polarization. One of the ways people form the impression that an issue has a kind of significance for their group, that one position is associated with their group and the other position with another group, just comes from the correlation they see between the perceived identities of advocates or communicators on the one hand and the kinds of positions they are taking on the other. When people tune in and get the information about a science issue and they see that everybody on this side is someone like me and everybody on that side seems to be somebody like that, that's a communication failure. You don't want to, in effect, pollute your communication environment with that kind of toxin, that us versus them. If it's like that, then when people are presented with information that challenges what they already believe, they're going to feel much more threatened.

Q: And this is why faith leaders can be more effective than the stereotypical environmentalist?

A: Well, all kinds of people. The truth is, we don't need to hire actors or anything; it just is true that people of all kinds of backgrounds and outlooks hold these positions, and you ought to be including a diverse array of communicators when you're trying to make people aware of what the best information is.

Q: Someone recently suggested to me that blaming scientists themselves for ineffective communication is akin to battered wife syndrome…

A: The information that's being conveyed to people has two aspects: one is the content of what you're telling them about the kinds of public health risks they face, but another is the cultural meaning of the issue: 'what should a person like you think about it?' It's that second channel of communication that's critical. It's the information communicated along that dimension that generated the kind of self-reinforcing political dynamics that polarize people on climate change. And no scientist talking about his or her science was conveying anything about what it means to have a position on this or that issue. That happened for a whole bunch of reasons, but as for the idea that people were put off by scientists—most people couldn't name a climate scientist to save their life. They probably could name one or two scientists, period. They've never heard, never seen, any of the information in the way a scientist would communicate it. These things filter out in a complicated way through the kinds of networks and associations that ordinary people have, and what's being transmitted there are these meanings.

Q: So how much are the research community and advocates embracing this idea and incorporating cultural cognition into campaigns?

A: First of all, there's obviously more than one kind of dynamic at work in the effective communication of science, including ways of presenting information so that it's intelligible to people and they can comprehend the content of it; there's more than one factor. The factor I'm talking about, the tendency of people to fit their perceptions of risk and related facts to their identities or their values, can be understood in related ways, and there's a family of methods and frameworks all of which have that same kind of dynamic behind them: the motivated cognition by which people assess the fit between the information and their group identities. I think that idea is out there. It might even be the case that, more than understanding that it's hierarchical individualists versus egalitarian communitarians, somebody who's out there trying to talk to, say, farmers about adaptation or mitigation should just know that the message has to be presented in terms that resonate with that person's outlooks, and he or she, the communicator, will figure out in a more fine-grained way, and through experimentation, what that way of talking is. So I don't care if they know the phrase cultural cognition; what I care about is that they understand the insight that people are reacting to information based on what they understand it to signify, even unconsciously, about the status of their group; that's really consequential.

There are lots of groups that understand these kinds of principles and are using them, and as groups engaged in education and outreach try to adapt their strategies to the kinds of things that social science research finds out, they generate more evidence about what works and what doesn't. That kind of reciprocal sharing of information is a good formula for figuring out how to do this right.

Q: Our next Climate Access roundtable is about overcoming fatalism, and I'm curious whether hearing about this work makes people more hopeful or more fatalistic.

A: There's a problem. We all know that. The prospect that we'll solve it depends on being able to understand what the source of the problem is, and to understand that source in a way that's precise enough to come up with strategies to address it. If people get the impression that our work is depressing, I think it's because they haven't really engaged enough or seen enough of it. At the same time that we're able to identify mechanisms or dynamics that really do tend to make people fit the information you're giving them to their values, we're also identifying things that might help to avoid that. That's a reason to be optimistic, because the things we've been trying so far haven't really been working that well.

An example of what I'm talking about is a social scientist at the University of Minnesota, Heather LaMarre, who studies satire. One thing she finds is that the kind of defensive reactions I'm describing can be avoided to some extent if you're communicating information in a humorous or satirical way; people are drawn in by the story or by the satire, they're trying to figure out what you're saying, and before you know it, they're laughing. And if they're laughing, they're not being defensive, and you've kind of gotten around that barrier. It makes me pretty happy to hear somebody have an insight like that. And I think our work is also suggestive of lots of strategies you can use to make things better.

Q: So what's next for the Cultural Cognition Project?

A: We've been looking at how cultural values interact with people's ability to process quantitative information. One of my colleagues, Ellen Peters, is probably the very top researcher on numeracy, how differences in the ability to make sense of quantitative information affect people's perceptions of risk, and we find that that too is very much connected to cultural cognition: people who are more numerate can actually become more polarized. So we're doing experiments that look at why it might be that the people you would expect to be the most sophisticated consumers of evidence can nevertheless be the most polarized, and I think that's important because those people are obviously going to be pretty influential in the formation of public opinion.
 


 

KEY TAKEAWAYS:

  • People react to information based on what they understand it to signify, even unconsciously, about the status of their group.
 
  • The perceived identities of communicators affect how their messages are perceived; one way to mitigate public conflict over scientific evidence is to make sure that sound information is vouched for by a diverse set of experts.
 
  • People tend to push back when they are presented with information that appears to challenge their predominant group beliefs. So, present information in a manner that affirms rather than threatens people’s values.
 
  • It is a communication failure when people are presented with information that challenges what they already believe and it appears that everybody on one side is like them and everybody on the other side is not. This can make people feel more threatened.

 

Photo via (cc) Flickr user victoriapeckham
