Tim Squirrell is a PhD candidate in Science and Technology Studies at the University of Edinburgh. His research focusses on the construction and negotiation of authority and expertise on the internet, with a particular focus on fitness and nutrition communities.

Cultural Cognition of Scientific Consensus

In this post I want to summarise a paper by Kahan et al. entitled "Cultural Cognition of Scientific Consensus". It's one of the papers I most often recommend to my students and friends, because it helps to explain how people can differ so radically on "what the evidence shows" and "what experts agree on". This is, I believe, one of the most important issues of our time: we generally like to think that societies make policy about risks and threats based on the available evidence (at least to some extent), but the ability to do so is in some sense contingent on having populations that agree on what that evidence shows.

Because I believe it's important and useful to understand, I'm presenting this summary of what is quite a long paper, which I imagine not everyone has the time or resources to read and digest.

A link to the paper is here for those who have institutional access: http://www.tandfonline.com/doi/abs/10.1080/13669877.2010.511246

What follows is a summary of the main themes and findings of the paper. Much of the methodology section is excluded, as I imagine anyone reading this is likely to want the TL;DR of the paper.

The TL;DR:

  1. Cultural cognition of risk is the tendency of individuals to form risk perceptions that are congenial to their values.
  2. Cultural cognition thesis: individuals are psychologically disposed to believe that behaviour they and their peers find honourable is socially beneficial, and behaviour that they find distasteful is socially detrimental.
  3. Cultural cognition shapes individuals' beliefs about the existence of scientific consensus.
  4. Three areas are explored: climate change, the disposal of nuclear wastes, and the effect of permitting concealed possession of handguns.
  5. Public debates about science rarely involve open opposition to science itself; rather, they feature disagreement about what the scientific evidence really shows. Individuals disagree on what scientists are telling them: they are not ignorant of or indifferent to what scientists say.
  6. The authors extend this to risk: individuals selectively credit or dismiss evidence of risk in patterns that fit values they share with others.
  7. This happens through an availability heuristic: when thinking about a controversial issue, we might perform a mental survey of experts we have seen express opinions on the issue, and our perception of their consensus (or lack thereof) is coloured by a tendency to more readily recall instances of experts taking on a position that is congruent with our cultural predisposition.
  8. There is a strong correlation between individuals' cultural values and their perceptions of scientific consensus on risks known to divide people of opposing worldviews - people who have hierarchical and individualistic worldviews disagreed substantially with those holding egalitarian and communitarian worldviews on the state of expert opinion on climate change, nuclear waste disposal, and handgun regulation.
  9. When asked to evaluate whether an individual of elite academic credentials was a "knowledgeable and trustworthy expert", people answered based on the fit between the position the expert was depicted as adopting and the position usually associated with the subject's worldview.

The more detailed version:

How might cultural cognition shape beliefs about expert consensus?

  1. Availability heuristic: when thinking about climate change we might perform a mental survey of experts we have observed offering opinions on the issue. "Consensus" comes from being able to recall individuals, groups or institutions taking positions in one direction or the other, and the cultural cognition thesis suggests that we more readily recall instances of experts taking a position that is congruent with our cultural predisposition.
  2. Perceptions of credibility: we more readily grant expert status and the knowledge and trustworthiness that goes with it to sources we perceive as sharing our worldviews. These sources are therefore likely to be overrepresented in our inventory of experts, bolstering the effects of (1) and meaning that individuals of opposing outlooks will have different impressions of what "most" experts believe.
  3. Biased assimilation: we tend to pay attention to information in a way that reinforces our prior beliefs, and this in turn leads to forming skewed assessments of the authority of would-be experts.
  4. Search bias: we tend to search out information congruent with our cultural predispositions, and so we are more likely to work hard to find expert opinion supportive of our existing perceptions.

 

The study measures subjects' cultural values along two dimensions:

  1. Hierarchy: hierarchy-egalitarianism - attitudes towards social orderings that connect authority to stratified social roles based on immutable characteristics (e.g. gender, race, class)
  2. Individualism: individualism-communitarianism - attitudes towards social orderings that expect individuals to secure their own wellbeing without assistance from society, versus those that assign society the obligation to secure collective welfare and override competing individual interests

Scientific consensus, in these experiments, is cashed out as "the majority of scientists agree with this proposition".

Results:

  1. Base level:
    1. 55% perceived scientific consensus on rising global temperatures; 33% reported perceiving division;
    2. 45% perceived scientific consensus on anthropogenic global warming; 40% reported perceiving division;
    3. 25% perceived agreement on safety of geologic isolation of nuclear waste; 46% perceived division;
    4. 26% perceived agreement on crime-reducing impact of concealed carry laws; 41% perceived division.
  2. Hierarchical individualists:
    1. 56% believe scientists are divided on global warming, another 25% say most experts disagree that global warming is happening.
    2. 55% believe that most experts are divided on anthropogenic climate change, and another 32% believe most experts disagree that it's happening.
  3. Egalitarian communitarians:
    1. 78% believe scientists agree global warming is happening; 68% believe most scientists agree it's anthropogenic.

Similar patterns could be found for concealed carry laws and nuclear waste.

Two key results support the basic hypothesis that scientific opinion fails to resolve societal dispute because culturally diverse individuals form opposing perceptions of what experts believe, and that individuals systematically overestimate the degree of scientific support for positions they are culturally predisposed to accept:

  1. Strong correlation between individuals' cultural values and their perceptions of scientific consensus on risks known to divide people of opposing worldviews - people who have hierarchical and individualistic worldviews disagreed substantially with those holding egalitarian and communitarian worldviews on the state of expert opinion on climate change, nuclear waste disposal, and handgun regulation.
  2. When asked to evaluate whether an individual of elite academic credentials was a "knowledgeable and trustworthy expert", people answered based on the fit between the position the expert was depicted as adopting and the position usually associated with the subject's worldview.

The authors propose three ways of improving risk communication based on the data above:

  1. Identity affirmation: individuals are less dismissive of information about risk that affirms their cultural values than of information that threatens them.
  2. Pluralistic advocacy: individuals are more open-minded about information if they believe that there are experts of diverse values on both sides of the debate.
  3. Narrative framing: individuals assimilate information by fitting it to pre-existing narrative templates which give it meaning. Risk communication needs to evoke narrative templates that are culturally congenial to target audiences.

I hope this summary is helpful to a few people. Dan Kahan's work is fascinating, and I'd highly recommend seeking out more of it.