There’s a reason scientists are marching on Washington on April 22nd. They’re sick of us. We aren’t listening to them – our opinions still differ from theirs and the politicians we elect don’t make policies based on their recommendations.
No issue exemplifies this struggle better than climate change. According to many polls, including a 2016 Pew poll, only 48% of the American public believes climate change is due to human activity, despite the consensus among climate scientists. And when asked about the findings of those scientists, only 27% of the public acknowledges that a consensus exists.
The gap between what we believe and what the scientific method has shown to be true is huge.
As it turns out, there are many reasons:
- First, this shit is complex
- We have a faulty view of science
- Our brains can’t handle a problem like climate change
- So we believe what our tribes believe
- Most influentially, what our political tribe believes
- Some of our tribes have a negative stereotype of environmentalists, and we don’t want to be one
First, this shit is complex
The Earth’s climate includes the oceans, wind, the biosphere, upper and lower atmosphere, solar effects, glaciers, clouds, evaporation, and so on. Add to these the complexities of determining the long-term historical average temperatures of the Earth, and of predicting future temperatures based on a very large number of assumptions, parameters, and variables.
The study of climate change, therefore, involves the advancement and integration of our knowledge in all these areas of science, which individually are very complex. This complexity has contributed to the high level of misconception and misunderstanding by the general public, by the media and even by those in positions of power. Its complexity has also contributed to the difficulties experienced by the scientific community to fully explain the science and convince the global community in a cohesive and comprehensive manner. [via Fair Observer]
We expect too much certainty from science
As I learned in Merchants of Doubt, denial of climate science has worked because we have an erroneous view of science:
We think that science provides certainty, so if we lack certainty, we think the science must be faulty or incomplete.
History shows us clearly that science does not provide certainty. It does not provide proof. It only provides the consensus of experts, based on the organized accumulation and scrutiny of evidence.
Sensible decision making involves acting on the information we have, even while accepting that it may well be imperfect and our decisions may need to be revisited and revised in light of new information. For even if modern science does not give us certainty, it does have a robust track record.
We have sent men to the moon, cured diseases, figured out the internal composition of the Earth, invented new materials, and built machines to do much of our work for us—all on the basis of modern scientific knowledge. While these practical accomplishments do not prove that our scientific knowledge is true, they do suggest that modern science gives us a pretty decent basis for action.
Our brains can’t handle a problem like this
We have many, many cognitive limitations that prevent us from thinking about and responding to climate change in a rational way. This list barely begins to scratch the surface:
Limitation 1: Human brains aren’t wired to respond easily to large, slow-moving threats.
Our brain is essentially a get-out-of-the-way machine. That’s why we can duck a baseball in milliseconds. Many environmentalists say climate change is happening too fast. No, it’s happening too slowly. It’s not happening nearly quickly enough to get our attention.
Limitation 2: We’re wired to avoid short-term loss
“Loss aversion bias” means we’re more afraid of losing what we want in the short-term than surmounting obstacles in the distance.
Limitation 3: We’re overly optimistic
Our built-in “optimism bias” irrationally projects sunny days ahead in spite of evidence to the contrary.
Limitation 4: We prefer information that confirms our pre-existing beliefs
We seek out information not to gain knowledge for its own sake, but to support our already-established viewpoints. This is known as confirmation bias. We also pay far less attention to evidence that disputes our beliefs.
In the same vein, we also over-weigh early evidence (irrational primacy effect) and hold later evidence to a higher standard (disconfirmation bias):
How often have you heard climate change deniers claim that there isn’t enough evidence to support global warming? This is known as disconfirmation bias: When we are presented with evidence that disputes our beliefs, and we then hold that evidence to a higher standard. We suddenly start to dispute the thoroughness or the impartiality of evidence, in a way we would never do for evidence that does not dispute our pre-conceived beliefs. A disconfirmation bias is particularly tricky to spot because, in your own mind, it can give the impression that you are indeed being thorough and scientific. [via Fair Observer]
Limitation 5: We have trouble responding to or making decisions in uncertainty
Our brains respond most decisively to things we know for certain. The more uncertainty involved, the less able we are to act. And science always involves some uncertainty, even when there is more than enough solid information to act upon.
Limitation 6: We discount the future
As shown in the classic marshmallow experiment, we prefer the present over the future.
There’s a two-year-old in the back of our minds, one we’ve learned to overrule, who still wants their one marshmallow now rather than waiting for two marshmallows. Very few people on this planet want to destroy planet Earth. It’s just that our other agendas get in the way of things that have a longer time horizon.
Limitation 7: We’re biased by personal experience
Many of us don’t really see anything happening, so we question whether it’s actually happening.
The importance of experiential learning creates several challenges to a public consensus needed to implement meaningful climate change policy. In places where the weather actually is cooling over time, contradictions between personal experiences with local changes in climate and the scientific evidence for climate change seem settled in favor of personal experience. Changing this weighting in favor of scientific evidence will be difficult given the importance of personal experience.
Limitation 8: We don’t know what we don’t know, so we think we know…
This is called the Dunning-Kruger effect:
(…) people usually don’t know enough about a topic to realize that they don’t know enough about it to make appropriate and rational decisions. Paradoxically, improving the skills of the participants, and thus increasing their competence, helped them recognize the limitations of their abilities and knowledge. In other words, people who know very little think they know more than they actually do, while those who know a lot think they know less than they actually do.
Limitation 9: We have trouble responding to disturbing information that we can’t do anything about
We don’t want to believe climate change is happening, feel guilty that it is, and don’t know what to do about it. So we pretend it’s not a problem.
“Our response to disturbing information is very complex. We negotiate it. We don’t just take it in and respond in a rational way,” said Norgaard.
In order to have a positive sense of self-identity and get through the day, we’re constantly being selective about what we think about and pay attention to. To create a sense of a good, safe world for ourselves, we screen out all kinds of information, from where our food comes from to how our clothes are made. When we talk with our friends, we talk about something pleasant.
The reason is that we don’t have a clear sense of what we can do. Any community organizer knows that if you want people to respond to something, you need to tell them what to do, and make it seem do-able. Stanford University psychologist Jon Krosnick has studied this, and showed that people stop paying attention to climate change when they realize there’s no easy solution. People judge as serious only those problems for which actions can be taken.
We believe what our tribes believe
Humans are essentially and necessarily social animals, and the need for belonging to a mob or a tribe is vital for identity and our instincts for survival. We also use our tribe to take cognitive shortcuts – if someone we trust has thought this through, we can align with them rather than think it through on our own.
Oftentimes, in determining who is credible, individuals trust those who share similar worldviews and personal values. They tend to seek out information congenial to their cultural predispositions.
“Cultural cognition” is the term used to describe the process by which individuals’ group values shape their perceptions of societal risks. It refers to the unconscious tendency of people to fit evidence of risk to positions that predominate in groups to which they belong. The results of the study were consistent with previous studies that show that individuals with more egalitarian values disagree sharply with individuals who have more individualistic ones on the risks associated with nuclear power, gun possession, and the HPV vaccine for school girls.
“In effect,” Kahan said, “ordinary members of the public credit or dismiss scientific information on disputed issues based on whether the information strengthens or weakens their ties to others who share their values. At least among ordinary members of the public, individuals with higher science comprehension are even better at fitting the evidence to their group commitments.” [via Yale]
We’ve politicized it
Our belief in climate science is heavily correlated with our political affiliation.
In 2012, Gordon Gauchat conducted a study that looked at the politicization of science in the public sphere from 1974 to 2010. Using data from the General Social Survey, he looked at group differences in trust in science and changes in those groups over time. According to Gauchat’s work, conservatives began the period with the highest trust in science relative to liberals and moderates—and ended it with the lowest.
As detailed in Merchants of Doubt, the Republican brand of climate science denial is rooted in economic idealism: doing something about climate change would mean standing in the way of the holy free market. The free market is king, apparently even when it comes up against facts.
The same Pew survey I mentioned at the beginning of this post breaks the numbers down by political affiliation. Behind the overall 48 percent who believe the Earth is warming mostly because of human activity lies a sharp partisan split: 79 percent of liberal Democrats hold that belief, compared with 63 percent of moderate Democrats, 34 percent of moderate Republicans, and 15 percent of conservative Republicans.
We have a stereotype of environmentalists, and we don’t want to be one
I learned a lot about this perspective by reading an evangelical Christian, conservative Republican’s take:
People who care about the environment are left-wing, socialist, former hippies who have no job and hate those who do. There were no clean-cut, all-American patriots camping out in the offices of Exxon Mobil. The message was clear, if you cared about the environment you aligned yourself with some pretty undesirable people.
He goes on to add that many people see environmentalists as alarmists, shame-peddlers, and atheists who worship nature (got me there!).
To care about the environment was to align with the lunatic fringe.