When you’re communicating, whether about the frustration of finding facial hair stubble in your bathroom sink or the importance of addressing climate change, it’s useful to think not only about the idea you want to get across, but also how you want to get it across. Which words do you want to use, or which ones do you want to avoid, for fear that they’ll make your spouse or conversation partner feel defensive or closed-minded? How do you want to bring up this topic? What other situations do you want to compare it to?
We’re accustomed to framing our everyday conversations carefully in order to maximize the chances of a desired outcome, like a clean bathroom sink, and minimize the chances of an undesired one, like giving offense. We need to use this same meta-cognitive strategy — framing — whenever we communicate science, and especially when communicating science that some audiences may want to resist.
I’ve created this handout to give an overview of framing in the context of published research. What does research tell us about how we should communicate issues like the importance of vaccinations or addressing climate change? The handout includes takeaways from each of the topics to help science communicators apply research on the science of scicomm.
What other strategies would you like to learn more about? I’m brainstorming my upcoming handouts and would love to hear from readers about topics from the science of science communication that would be most helpful.
America’s kind of tense right now. Leading up to and following the November 2016 election, there’s been a lot of talk of “the two Americas” and “the Divided States of America.” Americans are divided on a lot of issues, including scientific topics like vaccine safety and global warming. To many, it’s surprising that we disagree about these things, because according to the scientists who research these topics, there are no debates at all: vaccines do not cause autism, and humans are responsible for global warming.
I’m a PhD student in Cognitive Science, a firm believer in the scientific method and basing beliefs and actions on evidence. I highly value scientific funding, vaccinations, and measures that reduce the effects of climate change. As Americans, we have freedom of speech, and we should exercise that freedom to speak up when scientific knowledge and interests are being trampled on. I agree with the ideas expressed in blog posts like The War on Facts is a War on Democracy and I’m a Scientist. This is what I’ll Fight for and many of the ideas that continuously populate threads on Twitter like #defendscience and #resist. But I’m much less enthusiastic about the widespread use of a war metaphor to get those ideas across.
Metaphors shape thought
The metaphors we use to describe complex social problems actually shape the way we think about them. For example, when crime was described as a beast ravaging a town, people tended to suggest harsh law enforcement policies — similar to how they’d likely react to a literal beast ravaging their town. On the other hand, when that same crime was described as a virus, people suggested fewer harsh enforcement policies. Instead, they turned their focus to curing the town of problems that may underlie the crime, like improving education and welfare services.
People make inferences in line with the metaphors used to describe complex issues, so it’s important to reflect on what the war on science implies. It does have some helpful implications. Wars are serious, and often require urgent action. These are probably the messages that those who perpetuate the war-on-science framing want us to infer, even if not consciously.
But the war metaphor also suggests that there are enemies and casualties. There are two sides locked in combat, and neither will back down until it wins (or is decimated). I like this quote from A Gentleman in Moscow, a novel I just happened to be reading while working on this post: “After all, in the midst of armed conflicts, facts are bound to be just as susceptible to injury as ships and men, if not more so.” In other words, we sometimes do stupid things in wars. We shirk thoughtfulness and conscientiousness, and instead we just fight. As I see it, our current political situation (for lack of a better word) needs all the thoughtfulness and conscientiousness we can give it.
I recently expressed my concern in a conversation on Twitter:
@mgpineau Nice piece- I've been hesitant to use the war frame for the current mess though. What about it works for you?
The war metaphor challenges those who are not already on the “side” of science. It tells them they’re the enemy. When people feel that they’re being attacked, even if only rhetorically, they’re likely to strengthen their stance and gear up to fight back. No matter how many scientists tweet about science or participate in the March for Science on Earth Day, people who have found themselves on the “anti-science” side of this war are not going to decide all of a sudden that climate change must be real after all, or that they should rush their kids to the pediatrician for overdue vaccines (especially if we tell them we’re marching to fight the war on science!). People who have been labeled as enemies of science may as well go out and buy a new gas guzzler and decide that their kids are just fine without vaccines.
If we want to stop thinking about ourselves as engaged in a war on science, we need an alternative. Proponents of and believers in science are experiencing a sort of struggle, but it doesn’t have to be a fight between the left and right, Democrats and Republicans, Coastal Elite and Middle America. Maybe we can reframe the situation as a challenge that unites all humans. Science communicators want to share how important it is to address climate change and to have children vaccinated for the good of all people. We can all be on the same side, working to better the world we live in, and it’s important that we convey that message in our communications.
Referring to the movie Hidden Figures, NPR blogger Marcelo Gleiser points out that “if there is a central lesson in the movie, it is that united we win; that what makes America great is not segregation and intolerance, but openness and inclusiveness.”
I considered the possibility that guiding people to trust empirical evidence and the scientific process might be better framed as a puzzle — a challenge, no doubt, but at least everyone’s working toward a common goal.
@RoHendricks Appropriateness seems as important as entailments right now. Is puzzle too mild e.g. this is not a game?
Marisa makes a really important point. The peacekeeper in me would love a frame that emphasizes hey, guys! We’re all in this together!, but that ship may have already sailed. At this point, it’s important not to downplay the gravity of discrediting and distrusting science. This is not a game.
I’ve had quite a few conversations on the war on science, but I still don’t have a one-size-fits-all framing suggestion for talking about America’s disconnect in belief in science. But when we’re considering talking about this issue as a war, it’ll be helpful to step back and assess our goals and the potential consequences of the words we use.
Right now, there are deep social and political divides in American society — and though it’s crucial to stand up for what we believe in (especially science and facts!), we should be careful about taking up arms in a war on science that might deepen those divides.
I welcome other comments on the framing of the war on science. Do you find the war helpful? Why? Are there other frames we could use to avoid deepening ideological divides?
Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it’s just a problem we have to deal with?
If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.
For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.
The paradox of science communication
“Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know,” writes Yale law professor Dan Kahan, a leading researcher in the science of science communication.
Kahan’s work shows that having scientific knowledge doesn’t guarantee that someone will hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.
Instead, beliefs are shaped by the social groups people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.
Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across?
A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.
One framing technique Kahan encourages is disentangling facts from people’s identities. Biologist Andrew Thaler describes one way of doing so in a post called “When I talk about climate change, I don’t talk about science.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.
Let me rephrase that
Metaphors also provide frames for talking about climate change. Recent work by psychologists Stephen Flusberg, Paul Thibodeau and Teenie Matlock suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.
The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were exactly the same, but they used different metaphors: One referred to the “war against” and another to the “race against” climate change. For example, each article included phrases about the U.S. seeking to either “combat” (war) or “go after” (race) excessive energy use.
After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.
Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.
The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?
The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming.
There are other analogies that convey the causes and consequences of global warming well. Work by psychologists Kaitlin Raimi, Paul Stern and Alexander Maki suggests it helps to point out how global warming is similar to many medical diseases. For both: risks are often caused or aggravated by human behaviors; the processes are often progressive; they produce symptoms outside the normal range of past experiences; there are uncertainties in the prognosis of future events; treatment often involves trade-offs or side effects; it’s usually most effective to treat the underlying problem instead of just alleviating symptoms; and they’re hard to reverse.
People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.
Golden past or rosy future?
Climate change messages can also be framed by focusing on different time periods. Social psychologists Matthew Baldwin and Joris Lammers asked people to read either a past-focused climate change message (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).
The researchers found that self-identified conservatives, who tend to resist climate change messages more than liberals do, agreed more strongly that we should change how we interact with the planet after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.
And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (satellite images that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.
Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea.
There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to know their audience and anticipate their reactions to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.
Climate change (is it happening? how problematic is it? are humans responsible?) is a partisan issue. Work by Dan Kahan (which I’ve written about before) shows that conservatives are more likely than liberals to believe that climate change is not a result of human activity and that, left unaddressed, it will not be as destructive as many people claim. Researchers Matthew Baldwin and Joris Lammers explore the possibility that partisan differences in beliefs about climate change might result from differences in the way conservatives and liberals tend to think about time (their temporal focus).
Their starting point was previous research showing that conservatives focus more on the past than liberals do. They then tested two competing frames: one was future-focused (“Looking forward to our nation’s future… there is increasing traffic on the road”) and the other was past-focused (“Looking back to our nation’s past… there was less traffic on the road”). Each participant read just one of these and then reported their attitudes about climate change and the environment. Conservatives reported liking the past-focused message better than the future-focused one, and they also reported more pro-environmental attitudes after the past-focused frame than after the future-focused one.
They replicated these findings in additional experiments with variations. For example, in one test, instead of using linguistic frames to draw attention to either the past or the future, they used satellite images, either showing a progression from the past to today or a forecasted progression from today into the future. Again, conservatives reported more pro-environmental attitudes after viewing past-focused images than future-focused ones.
Next they investigated the temporal focus that real environmental charities tend to use. Not surprisingly, they found that the charities’ messages disproportionately express future consequences, with less focus on the past. Following up on this, they gave participants money that they could divide between two fictional charities (one whose message was strongly past-focused and one whose message was strongly future-focused), or keep for themselves. Participants saw each charity’s logo and mission statement (the past-focused one stated “Restoring the planet to its original state”; the future-focused one, “Creating a new Earth for the future”).
Conservatives donated more to the past-oriented charity than to the future-oriented one. Liberals did the opposite. Further, looking at just the past-oriented charity, conservatives donated more than liberals did; looking at just the future-oriented one, the opposite pattern emerged. This is a very clean interaction (plus the researchers ran a few other experiments with slightly varied methods and a meta-analysis, all of which add weight to these findings).
Considering the finding that climate change communications rely heavily on future-focused appeals, these results should give us pause. Is it possible that climate change issues themselves are not what divides conservatives and liberals so much as the way those issues are communicated? My intuition is that framing is not entirely to blame for conservatives’ and liberals’ divergent beliefs about climate change, but this work shows it may be a big part of the story. It certainly won’t hurt for communicators to start diversifying the temporal frames we use to discuss climate change.
We humans have collectively accumulated a lot of science knowledge. We’ve developed vaccines that can eradicate some of the most devastating diseases. We’ve engineered bridges and cities and the internet. We’ve created massive metal vehicles that rise tens of thousands of feet and then safely set down on the other side of the globe. And this is just the tip of the iceberg (which, by the way, we’ve discovered is melting). While this shared knowledge is impressive, it’s not distributed evenly. Not even close. There are too many important issues on which science has reached a consensus but the public has not.
A common intuition is that the main goal of science communication is to present facts; once people encounter those facts, they will think and behave accordingly. The National Academies’ recent report refers to this as the “deficit model.”
But in reality, just knowing facts doesn’t necessarily guarantee that one’s opinions and behaviors will be consistent with them. For example, many people “know” that recycling is beneficial but still throw plastic bottles in the trash. Or they read an online article by a scientist about the necessity of vaccines, but leave comments expressing outrage that doctors are trying to further a pro-vaccine agenda. Convincing people that scientific evidence has merit and should guide behavior may be the greatest science communication challenge, particularly in our “post-truth” era.
Luckily, we know a lot about human psychology – how people perceive, reason and learn about the world – and many lessons from psychology can be applied to science communication endeavors.
Consider human nature
Regardless of your religious affiliation, imagine that you’ve always learned that God created human beings just as we are today. Your parents, teachers and books all told you so. You’ve also noticed throughout your life that science is pretty useful – you especially love heating up a frozen dinner in the microwave while browsing Snapchat on your iPhone.
One day you read that scientists have evidence for human evolution. You feel uncomfortable: Were your parents, teachers and books wrong about where people originally came from? Are these scientists wrong? You experience cognitive dissonance – the uneasiness that results from entertaining two conflicting ideas.
One way we subconsciously avoid cognitive dissonance is through confirmation bias – a tendency to seek information that confirms what we already believe and discard information that doesn’t.
This human tendency was first exposed by psychologist Peter Wason in the 1960s in a simple logic experiment. He found that people tend to seek confirmatory information and avoid information that would potentially disprove their beliefs.
The concept of confirmation bias scales up to larger issues, too. For example, psychologists John Cook and Stephen Lewandowsky asked people about their beliefs concerning global warming and then gave them information stating that 97 percent of scientists agree that human activity causes climate change. The researchers measured whether the information about the scientific consensus influenced people’s beliefs about global warming.
Those who initially opposed the idea of human-caused global warming became even less accepting after reading about the scientific consensus on the issue. People who had already believed that human actions cause global warming supported their position even more strongly after learning about the scientific consensus. Presenting these participants with factual information ended up further polarizing their views, strengthening everyone’s resolve in their initial positions. It was a case of confirmation bias at work: New information consistent with prior beliefs strengthened those beliefs; new information conflicting with existing beliefs led people to discredit the message as a way to hold on to their original position.
Overcoming cognitive biases
How can science communicators share their messages in a way that leads people to change their beliefs and actions about important science issues, given our natural cognitive biases?
The first step is to acknowledge that every audience has preexisting beliefs about the world. Expect those beliefs to color the way they receive your message. Anticipate that people will accept information that is consistent with their prior beliefs and discredit information that is not.
Then, focus on framing. No message can contain all the information available on a topic, so any communication will emphasize some aspects while downplaying others. While it’s unhelpful to cherry-pick and present only evidence in your favor – which can backfire anyway – it is helpful to focus on what an audience cares about.
For example, these University of California researchers point out that the idea of climate change causing rising sea levels may not alarm an inland farmer dealing with drought as much as it does someone living on the coast. Referring to the impact our actions today may have for our grandchildren might be more compelling to those who actually have grandchildren than to those who don’t. By anticipating what an audience believes and what’s important to them, communicators can choose more effective frames for their messages – focusing on the most compelling aspects of the issue for their audience and presenting it in a way the audience can identify with.
In addition to the ideas expressed in a frame, the specific words used matter. Psychologists Amos Tversky and Daniel Kahneman first showed that when numerical information is presented in different ways, people think about it differently. Here’s an example from their 1981 study:
Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If Program A is adopted, 200 people will be saved. If Program B is adopted, there is a ⅓ probability that 600 people will be saved and a ⅔ probability that no people will be saved.
Both programs have an expected value of 200 lives saved. But 72 percent of participants chose Program A. We reason about mathematically equivalent options differently when they’re framed differently: Our intuitions are often not consistent with probabilities and other math concepts.
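That equivalence is easy to verify directly. Here is a minimal sketch in Python; the probabilities and payoffs come from the scenario above, but the helper function and variable names are mine, for illustration only (using `Fraction` keeps ⅓ exact rather than a floating-point approximation):

```python
# Expected lives saved under each program in Tversky & Kahneman's (1981) scenario.
from fractions import Fraction

def expected_value(outcomes):
    """outcomes: list of (probability, lives_saved) pairs."""
    return sum(p * lives for p, lives in outcomes)

program_a = [(Fraction(1), 200)]                          # 200 saved for certain
program_b = [(Fraction(1, 3), 600), (Fraction(2, 3), 0)]  # the all-or-nothing gamble

print(expected_value(program_a))  # 200
print(expected_value(program_b))  # 200
```

Both come out to exactly 200 lives saved, yet the certain option and the gamble feel very different, which is precisely the framing effect at work.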
Metaphors can also act as linguistic frames. Psychologists Paul Thibodeau and Lera Boroditsky found that people who read that crime is a beast proposed different solutions than those who read that crime is a virus – even if they had no memory of reading the metaphor. The metaphors guided people’s reasoning, encouraging them to transfer solutions they’d propose for real beasts (cage them) or viruses (find the source) to dealing with crime (harsher law enforcement or more social programs).
The words we use to package our ideas can drastically influence how people think about those ideas.
We have a lot to learn. Quantitative research on the efficacy of science communication strategies is in its infancy but becoming an increasing priority. As we continue to untangle more about what works and why, it’s important for science communicators to be conscious of the biases they and their audiences bring to their exchanges and the frames they select to share their messages.
A team of researchers representing a range of academic departments across most of the schools in the University of California (UC) system recently published a chapter summarizing what we know about efforts to communicate climate disruption and how we can improve on them. It’s full of useful information (especially in the tables, which include things like common climate myths vs. facts and existing communication programs in the UC system). An overarching theme that I’ll focus on is that framing matters.
What’s a frame?
Picture frames often enhance the image inside. Frames can draw attention to the parts of the image that lie inside them and obscure or detract from the parts that lie outside. Linguistic frames do the same thing. The chapter refers to framing as “an effective communication tool for drawing attention to, legitimizing, and providing an interpretive context for abstract, complex, or unfamiliar information” (p. 9). For example, one person might frame a medical procedure by saying that it has a 70% success rate, while another might frame that same procedure as having a 30% failure rate. Although they both reflect the same information, each highlights something different — either success or failure — and psychology research has shown that in many instances, people reason differently when they encounter different frames for the same idea. Truly complex concepts like climate change can’t be communicated without framing, because it’s impossible for a communication to portray everything that’s known about a topic without highlighting some information and downplaying the rest.
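To make the success/failure example concrete, here is a small sketch; the 1,000-patient cohort is a hypothetical number I’m adding purely for illustration:

```python
# A hypothetical 1,000-patient cohort: the "70% success rate" frame and the
# "30% failure rate" frame describe one and the same set of outcomes.
patients = 1000
successes = 700
failures = patients - successes            # 300

success_pct = 100 * successes // patients  # 70 -> "70% success rate" frame
failure_pct = 100 * failures // patients   # 30 -> "30% failure rate" frame

# Complementary descriptions of identical information:
assert success_pct + failure_pct == 100
print(f"{success_pct}% success == {100 - failure_pct}% success")
```

Nothing about the data changes between the two frames; only which complement of the outcome is foregrounded.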
The power and ubiquity of framing show us that facts alone are not enough. Frames used to communicate about climate disruption need to be selected conscientiously in order to give people a sense of why they should care about the issue and what they personally can do about it. Climate change can be framed by highlighting the human health issues it creates, the economic gains that can be realized by addressing it, or effects on local versus global levels. Climate change can also be framed using images.
This image makes me think, damn, we need to save the Earth. If that one didn’t work for you, maybe this one will:
Considerations for Frames
There is no one-size-fits-all frame for motivating people to care about and act on climate change. Instead, communicators need to know their audience and anticipate the audience’s reaction to different messages. Tailoring frames for specific audiences becomes even more challenging when audiences are culturally diverse (a notable point, since the authors are all from California, the most populous and diverse state). But it’s a challenge worth taking up. In California, for example, a message about rising sea levels may resonate more with someone living on the coast than with someone living inland in an area affected by drought. Anticipating what matters to an audience can help communicators choose the most appropriate frames.
Religion provides an additional opportunity for framing. The major world religions emphasize humans’ responsibility to care for the natural world, and religious leaders have begun explicitly urging their followers to take this message seriously in the context of climate change. Climate change, unlike religion, is closely tied to political identity (almost half of Republicans are skeptical of climate change, while just over 10% of Democrats are). To get more people to acknowledge the gravity of climate change and the actions we need to take to prevent disaster, communicators should focus on reducing the political divide on the issue, for example by having prominent Republican groups and “opinion leaders,” people who have clout in their communities (such as Bible study or PTA leaders), speak about the urgency of addressing global warming.
Economics and business frames are also important to hone. Many people currently see addressing climate change as bringing about job losses, but in reality job prospects in the renewable energy sector are greater than those for traditional energy sources. Communicators need to emphasize these facts as well as highlighting the major companies that are already committed to improving energy practices.
Climate change is one of the most contentious issues nationally (and globally, at least in places where people have even heard of it), and communicating any controversial issue presents challenges (the subject of a chapter in the National Academies of Sciences, Engineering, and Medicine’s guide for effective science communication, which I summarized previously). Adequately addressing climate change may involve more scientific innovations, legislation, and a lot of behavior changes… but we won’t get there if we don’t also focus on communicating the gravity of the issue and what can be done about it.
The National Academies of Sciences, Engineering, and Medicine published a thorough (127-page) guide to communicating science effectively, with a detailed description of what the science of science communication has already revealed and, more importantly, an agenda for future research on this topic. It’s long but useful, so I’ve broken it down into an abridged guide. Yesterday I posted my distillation of Chapter 1, and today’s focus is Chapter 2.
Chapter 2: The complexities of communicating science
Public engagement: seeking and facilitating the sharing and exchange of knowledge, perspectives, and preferences between or among groups who often have differences in expertise, power, and values
Public engagement is important for goals of generating excitement, sharing info needed for a decision, and finding common ground on an issue among diverse stakeholders.
Challenges posed by scientific content
Uncertainty. People generally dislike uncertainty and avoid ambiguity. As a result, it might seem like avoiding talk of the uncertainty inherent in science would be a productive way to communicate. However, avoiding discussion of uncertainty is a problem too: it creates a false sense of certainty, and if (or when) new findings require the original information to be revised, people are likely to lose trust in the communicators. So far, presenting relevant narratives seems to be an effective way to engage audiences with scientific issues, helping them remember and process the information, but we need more research on the role of narratives in communicating science and on broader best practices for communicating scientific uncertainty.
Different audiences, different needs
Several audience characteristics affect science communication and help explain why the same information can be understood very differently by different people:
Prior knowledge of science
Plus, scientific knowledge alone doesn’t necessarily lead to positive attitudes toward science. Instead, a person’s characteristics, background, values and beliefs, and the information they receive from the media all influence how their scientific knowledge shapes their attitudes.
Ability to understand numeric information
Communication strategies that rely on quantities, rates, or probabilities are often more successful when they account for the fact that people (including scientists, particularly when the issue is outside their area of expertise) struggle to make sense of numeric information, rather than just presenting the numbers. In health communications, at least, the following strategies have proven helpful:
Don’t avoid the numbers – provide them.
Reduce the cognitive effort required of the consumer.
Explain what the numbers mean.
Draw attention to important information.
Ways of interpreting new information
Everyone has their own beliefs about the way the world works, and these beliefs play prominent roles in making sense of new information. We also rely heavily on mental shortcuts when we encounter new information:
Heuristics: We tend to believe information that is consistent with our preexisting beliefs, and information that we encounter frequently, over inconsistent or rarely encountered information.
Emotion: Our initial emotional reactions to new information can shape the way we continue to think about that information, and some research suggests that we tend to pay more attention to negative than positive information.
Motivated reasoning: We’re biased to make sense of information in a way that is consistent with our immediately accessible beliefs and feelings.
Cognitive dissonance: We’re able to hold two conflicting thoughts, but doing so often makes us uncomfortable, and we try to resolve the conflict for ourselves. If you really love Big Macs, for example, and you also know that health professionals say Big Macs are not good for you, you might feel some dissonance. You can either change your behavior (stop eating Big Macs) or justify your behavior by tweaking your beliefs (well, I walked into the restaurant instead of using the drive-thru, so I got my exercise and can probably have the Big Mac OR well, those scientists are studying mice, so really, does that apply to me? OR well, I’m poor and a Big Mac is cheap OR, or, or…).
Presenting information in different forms
The way we present information affects the way it’s received.
Framing is presenting information in a particular way to influence how people interpret it. When an issue is communicated as a priority or a problem, or when specific causes and solutions are emphasized, the issue is being framed. Framing is an inherent part of persuasion and of communication about complex topics: you can’t possibly present an issue in its entirety, so a communicator must decide what to highlight and what to downplay. Frames are most likely to be influential when they’re relevant to the way a person already thinks about the world.
Gain/loss framing: A 70% success rate and a 30% failure rate are mathematically the same, but depending on the context, may actually influence people in different ways. However, whether framing an issue in terms of potential gains or potential losses influences people more seems to vary based on the issue at hand, so we need more research to understand when each framing is most beneficial.
Emphasis framing: Complex issues are often presented as story lines that suggest different trains of thought, which in turn emphasize some features of an issue over others. In particular, scientific information is often presented in terms of personalized stories (episodes) or more generally (themes). Again, the issue at hand determines how productive emphasizing episodes vs. themes will be, so we need more research.
Trust and credibility of science communication
People rely primarily on social information to decide what and whom they believe about scientific issues:
Having common interests, in that the communicator and the audience both want the same outcome from the communication.
This point relates to the earlier points on the ways we encounter new information. When scientific information conflicts with someone’s political ideology, they might not only reject the information, but their trust in the communicator might also decline.
Perceived expertise, which is not equivalent to a communicator’s actual expertise.
Applying the lessons of large-scale science communication efforts
It’s important for audiences to receive sufficient exposure (that is, a lot) to information so that it can reach enough of the target audience and bring about change.
Timing matters: communication provided before people form strong opinions on a topic is likely to be more educational than communication provided after. It can also be helpful to expose people early to counterarguments for the misinformation they may eventually encounter, as a way of “inoculating” them against it.
Duration is also crucial: “long-term and comprehensive approaches” will likely be successful and necessary for communication goals. Isolated attempts are not enough.
An overall theme of this chapter is that because of the many complexities of communicating science, “…an effective science communication strategy will be iterative and adaptable… it will evolve over time based on lessons learned about what is and is not working, as well as shifting needs and opportunities.” (p. 35)
Tomorrow I’ll post a condensed guide to Chapter 3: The Nature of Science-Related Public Controversies.