Scott McInerney (0:00)
Hello! This is the Badger Herald Podcast, and I’m your host, Scott McInerney. Today we’ll be talking about scientific misinformation. While it may seem as if it became problematic in recent years, it’s actually been around ever since humans had the tools to spread it. Today, it’s everywhere.
We’ll discuss how certain areas of science are more vulnerable than others, and how scientific misinformation can have large impacts on society and public policy. Public health, COVID misinformation, vaccine hesitancy and climate change are just a few areas where people spreading falsehoods can have immense effects.
To get a better look at this issue, I spoke with Dietram Scheufele, a professor at the University of Wisconsin Department of Life Sciences Communication. We discussed his thoughts on scientific misinformation and society.
McInerney (1:00)
First off, could you just introduce yourself, your position and your research interests?
Dietram Scheufele (1:05)
Absolutely. I’m Dietram Scheufele, a faculty member in the Department of Life Sciences Communication, and a lot of my work deals with why people make decisions that are ultimately inconsistent with the best available scientific evidence, or very often, inconsistent with what’s best for them. So I teach undergrad and grad courses here in the department.
McInerney (1:27)
Why is studying and understanding scientific misinformation important?
Scheufele (1:31)
The question of if it is important or not is still a little bit of an open question, and there are two big schools of thought right now in academia: One group that says, “Well, misinformation is really the problem for our lifetime, maybe second only to climate change. Because people, if they only had the right information, would make much better choices,” and so on and so forth. There’s another group, and I’m probably closer to that group, that looks at the last 40-50 years of research and says, “Well, we actually know that giving more information to people – and that’s true for all of us, for that matter – doesn’t make them make choices that are more in line with the best available science.” Why? Because there are lots of other things that go into our decisions, values, what everybody else is doing, and so on and so forth. So we may be fixing a symptom in misinformation, but the underlying problem of the polarization of our society and how we share information on social media is really the problem that we need to fix. And misinformation is just one symptom of that larger underlying problem.
McInerney (2:35)
Yeah, that’s very fascinating. Now to hone in on COVID specifically, which is obviously a very prominent topic: If you could turn back the clock to 2020, when there was all this information flying around about a disease no one knew much about, knowing what we know now, what should we have done differently to combat COVID misinformation?
Scheufele (2:58)
It’s funny. We wrote a piece in April 2020, I think, in Slate, where we said, “There are a few things we might be doing wrong, and they’re not going to be trivial if we get them wrong.” One of them is if we politicize COVID, in the sense of partisan politics. Of course, we have to make political choices. Of course, we have to make policy choices. But if we keep those separate from partisan politics, we’re going to be fine. And of course, we did politicize it. I mean, there is a lot of fault to go around, certainly with the previous administration, but also with academics and others who very much made this an issue affiliated with partisan politics. And so if I could go back, I would probably shorten that article to that one bullet point saying, “Do not under any circumstances make this political.” Somehow we’ve managed to turn the idea that COVID is dangerous, that vaccines are safe, and that ultimately this is a virus we’re all fighting against together into an issue of Democratic versus Republican politics. And that, of course, has been the most dysfunctional kind of process in this whole overall dumpster fire that was COVID in this country.
McInerney (4:14)
Yeah, and so it seems to me you’re saying it’s more an issue of politicizing science? Would you say that misinformation plays a role in that process?
Scheufele (4:35)
Yes! Absolutely. So it’s partly about politicizing science, but science is always political, right? By definition, especially if science says, “Look, we know how COVID is going to play out and these are likely the best policy choices.” So yes, we’re inextricably linked to political choices and policy making. When I say political, I mean partisan politics. I mean science as a Democrat or Republican issue. Biden, for example, coming back into office and saying “science is back” was, in my opinion, a not very productive way of kicking off that debate, because it tells Republicans science is not for them, and if you voted Democrat, now there’s a president who brings science back into the White House. And of course, there’s not been a single administration, and I would include the Trump administration in that, that has been anti-science or that has not funded science, as Congress certainly did during those years.
Scheufele (5:23)
So then the question is, is that connected to misinformation? And that’s an interesting one. Again, I would highlight Biden, who accused Trump of telling people that they should inject bleach, which clearly was wrong. But the statement that Trump made about disinfecting people inside their body, or whatever he ended up trying to say, if you actually listen to the press conference, it was just some random set of statements that had no deeper meaning. The reason why we’re all talking about injecting bleach is because Twitter and the whole Twitter-sphere jumped on this and of course, freaked out and started error correcting and saying, “Well, this is misinformation, we need to fix this, we need to correct this!”
Scheufele (6:04)
Nobody would have even paid attention had we not started to correct this. Nobody would have even registered that statement. And there’s a very similar thing we studied in the past. When Sarah Palin, at the time running as the vice presidential candidate against Obama, was asked what foreign policy credentials she had, she basically said, “Because I was a governor in Alaska and there are parts of Alaska that are really close to Russia, so it is always in the front of our thinking.”
Scheufele (6:30)
Tina Fey on Saturday Night Live turned that into “I can see Russia from my house,” which seven in ten Americans to this day are convinced is what Sarah Palin actually said. So this idea of us engaging, either in a comedic way or in a corrective way, with what we think is misinformation very often does the exact opposite, meaning it ingrains into public consciousness in a way that is, in that case, unfair to Sarah Palin or unfair to Donald Trump. And I don’t think either one of them needs a lot of defending from me, but it just illustrates the phenomenon really powerfully.
McInerney (7:06)
A lot of what we’ve been talking about is COVID, and maybe some of these more controversial topics of science that can be politicized. With that, COVID has been at the center of the misinformation conversation lately. What are other areas of science where misinformation is a problem?
Scheufele (7:25)
I think there are a lot of areas of science where misinformation is rampant, and has been for a long time. There were studies in the 90s that tried to show that eating genetically modified organisms, for instance, caused cancer. Those studies were retracted, republished and highly publicized, even though they contradict the vast, vast, vast majority of research. Same thing, obviously, a lot of people are familiar with the Wakefield study in The Lancet on vaccines allegedly causing or being linked to autism. That also was retracted, and he lost his medical license over that and the faked data. So there’s been lots of misinformation going around. I would classify it in multiple ways. Number one is, does it actually matter for people’s behavior? If somebody thinks the earth is flat, I don’t care. Why? Because it’s unlikely to affect a lot of public health questions, or their own health for that matter, unless they catapult themselves into the air with a rocket to see if they can see the horizon and then crash into the rooftop of their car, which has happened, so I’m not making this up.
Scheufele (8:35)
But in general, does it have a public impact? Yes or no? COVID obviously falls into the second category, meaning misinformation may have really detrimental effects on public health, and so on and so forth. But then number two is, I would say, misinformation is a problem only if it is actually likely to get traction. Take some outlandish claim about 5G causing, or helping create, the COVID virus: unless we give it digital oxygen by constantly engaging with it, and giving it the credibility of being discussed in mainstream media, it’s going to basically be a very small bubble that’s not going to have a huge impact. On the other hand, if a piece of information gets endorsed by political candidates or by partisan players, then all of a sudden it has a much higher potential for actually spreading. So that’s the second characteristic: how high is the likelihood of it spreading?
Scheufele (9:30)
For a lot of the scientific issues that we’re dealing with, the impacts are actually much more nuanced. For GMOs, for instance, the misperceptions and inaccurate perceptions about how safe they are to eat have huge implications for American agriculture, for global markets, and so on and so forth. So the impacts are largely economic in nature, and they basically play out in consumer behavior.
Scheufele (10:00)
For COVID, the jury is still out a little bit. Are people, for example, not getting vaccinated because they don’t know any better? Or are they not getting vaccinated because of strongly held ideological beliefs that also make them reject information from the Biden administration or from the CDC, which they have long seen as being opposed to their candidate, meaning Donald Trump?
Scheufele (10:27)
So the misinformation, again, may be an outcome of the exact same predisposition these people have toward protecting their partisan identity. The thinking might go, “If I’m a strong Republican, I like Trump. Trump says that COVID is not real, and so I don’t believe in COVID.” So it’s not the misinformation about COVID that drives my behavior, it’s actually my partisan leaning. A good example of that phenomenon is people getting vaccinated in secret in Southern states, where their social networks are all conservative and they want to do what everybody else around them does, because that’s one of the big drivers of behavior. But they also know that getting vaccinated is actually good for them. And so they end up going out and getting vaccinated without telling their friends, because they want to have the benefit without dealing with the social or political fallout that may bring for them.
Scheufele (11:13)
COVID has just been the most recent example of a phenomenon that’s been around in science and elsewhere for a long, long time. The internet has made this commercially viable again, when in the traditional press, it just wasn’t anymore.
McInerney (11:30)
Would you say the issue is more about trust and faith in science generally, rather than specific individual false claims?
Scheufele (11:46)
Yes and no. The issue is partly about trust, but not in the way that we all think. If you actually look at the latest 2020 data from the General Social Survey, national survey data that gets collected at the University of Chicago every two years, the latest iteration shows that we’re seeing the widest gap between Democrats’ and Republicans’ trust in science in a long time. For a lot of academics, and in some of the journalistic write-ups of this, that result meant, “Oh, look, Republicans trust science less than they used to.” And there’s some truth to that. It was a less-than-10% but significant decrease in trust in science among Republicans.
Scheufele (12:27)
For Democrats, though, the positive jump is much, much higher; it’s well above 10%. So Democrats are now displaying the highest levels of trust in science that they have ever displayed. Do they mean by that gene drives? Do they mean nuclear energy? Do they mean areas of science that liberals traditionally don’t like? No, what they mean is COVID. Science has now become a partisan badge for Democrats that says, “We follow the science, and following the science is a Democrat issue.” Again, I talked at the beginning about how that’s strategically unsmart, because at some point you will have to rely on centrist voters to actually win key races once you get out of the primary.
Scheufele (13:05)
So we’re seeing two issues. One is that the gap is driven mostly by liberals. And number two, if you look at Independents (about 40% of the US population is Independent, 30% Republican, 30% Democrat), we have seven in ten Americans at stake, because Independents and Republicans actually look very similar in trust in science. So we’re running a real risk of losing seven in ten Americans on the issue of science, because we’re basically portraying it as an elite, liberal-leaning issue, which it is not. Science is an objective, neutral and, ideally, completely nonpartisan enterprise, which is why we claim that we have better ways of creating knowledge than anybody else. And number two, even if it were partisan, it would be politically unsmart to communicate it that way.
Scheufele (13:45)
That’s why, yes, it is related to trust, but not in the way that we really want it to be. The ideal scenario is that at some point most Americans trust science overall, but also have questions around modern issues and modern scientific breakthroughs, like AI or CRISPR genome editing. The ideal scenario is that all of us are overall trusting, with some level of reasonable skepticism. That would be the ideal world I would be hoping for, if we could do this all over again.
Scheufele (14:23)
This has been a long time coming. Facebook was created in 2004, and interestingly, I was just going back to some of our old stuff. We wrote a piece the year Facebook was created that said if we put people on social media, in social groups where they largely talk to each other, the democratic effects are going to be detrimental. They’re going to be highly problematic. This was not based on social media data, because social media didn’t exist. It was based on what we observed in surveys. And where are we 18 years later? This is exactly what happened. So a lot of people saw this coming. The problem, of course, is that we have a media system that’s driven by profits rather than by regulation and what’s best for democracy. That’s not good or bad, that’s just the way it is. But I think that’s also partly where the solution will have to lie: regulation, or thinking about what we as a country want moving forward. But we’ll see how that’s gonna go.
McInerney (15:19)
Next, I spoke with Sedona Chinn, an assistant professor also in the Department of Life Sciences Communication. Professor Chinn teaches a course on scientific misinformation in the media.
McInerney (15:32)
Great. Can we just start off? Could you introduce yourself, your position and your research interests?
Sedona Chinn (15:38)
Hi, I’m Sedona Chinn. I’m an assistant professor at the University of Wisconsin-Madison, in the Department of Life Sciences Communication. A lot of my research focuses on how people understand and make sense of different scientific uncertainties, often in the realm of how people respond to scientific disagreement when they hear that scientists don’t agree on something, and increasingly looking at disagreements and uncertainty on social media and the role of non-strategic or non-scientific communicators, like lifestyle influencers, in shaping attitudes toward science.
McInerney (16:12)
Great. And we’re going to start off with somewhat of a big question here. There is a scientific consensus that climate change is occurring and is caused by humans. Do you think individuals and groups challenging the climate change consensus pose a major problem to society?
Chinn (16:29)
I think there are kind of two questions there. One is that climate change poses a significant risk to our well-being in the future, and we need to, as humans, come together and take action to prevent the most harmful impacts of climate change and to try to prevent further increasing temperatures. The other is more of a social question: Is it a huge social problem when people are disregarding a scientific consensus? I think that comes down to why people are disregarding it. Does this represent a general pattern of rejecting all expertise across all domains? Or is this something more specific and localized, targeted toward particular politicized issues or issues that challenge people’s worldviews and values? What we find mostly is that denial or rejection of science is typically limited to a very, very small subset of science issues. And so that gives some hope that this isn’t a widespread issue.
McInerney (17:42)
I know you work a little bit in understanding when science gets politicized. Does that play a role in this particular issue?
Chinn (17:45)
Completely. The challenge to tackling issues of climate change seems to be politicization, and that happened quite early on, when climate change came to the public arena back in the 80s. Efforts to politicize climate change have been driven in large part by political actors who have specific policy goals about how environmental issues generally, and climate change in particular, should be addressed, and what the role of government is in regulating those issues.
McInerney (18:20)
Great. And so Twitter is banning ads that go against the scientific consensus on climate change, a new policy it recently added. Do you see this as a step in the right direction?
Chinn (18:37)
That’s an interesting one. I’ve not heard about Twitter specifically doing this, but I’ve seen Pinterest banning all climate misinformation as well. That’s come out over the past couple of weeks. And one of the things that I think they’re trying to do is essentially limit exposure to misinformation. And when we think about correcting misinformation, we’re sort of trying to do two things at once. One is preventing other people from being exposed to the misinformation, which is what this ban would sort of be aimed at doing. And the other would be trying to correct people who already hold misperceptions, and this sort of ban is unlikely to correct those people who already hold misperceptions about climate science and the causes of climate change. So different strategies will be needed to actually address that audience and in fact, those audiences might see this ban as some form of censorship and experience some sort of oppositional reaction to it.
McInerney (19:40)
So influencers are becoming an increasingly relevant source of information. How big a role do they play in disseminating scientific misinformation?
Chinn (19:47)
That is a fascinating question that I really want to study more. A lot of people think of lifestyle influencers as something very silly, superfluous, superficial and innocent. And yes, they’re not trying to communicate strategically on particular scientific topics. But when you have food influencers talking about buying non-GMO organic produce for their recipes, or fashion influencers talking about thrifting their fashion versus buying new, there is the opportunity for audiences, particularly audiences that aren’t very interested in explicitly scientific topics, to learn something about science, the environment or politics through the implicit or explicit messages embedded in this content. It’s a really under-researched area of science communication, and so I’m really excited to explore more of it.
McInerney (20:49)
You mentioned fashion and food as sort of topics of influencers. Where do you see other topics that might lead to different exposure to scientific content?
Chinn (21:00)
Yeah, there’s actually been a lot of journalistic work about how parenting bloggers and yoga influencers have been responsible for spreading misinformation around vaccination, particularly COVID vaccination and childhood vaccination, as well as promoting conspiracy theories like QAnon. That’s gotten a lot of attention in news media recently. The other place I would say people are getting a lot of science messages is skincare content. There are a lot of dermatologists and cosmetic chemists talking about the biochemistry of different skincare products in 45-minute-long YouTube videos while they review products. That’s often not seen as science content, but if you watch it, they have data visualizations and infographics explaining how the molecular size of different ingredients interacts with different levels of the skin. There’s a potential for people to learn a lot about biochemistry from these videos, and potentially be inspired to pursue careers in that area, which is something I want to explore more.
McInerney (22:00)
Yeah, that’s really fascinating. Moving on, we have a hypothetical for you: You’re placed in charge of managing scientific misinformation on a social media platform. You’re completely in charge. What steps would you take to sort of alleviate the problem?
Chinn (22:17)
That’s, I think, the job that nobody wants. The first thing to bear in mind is that there is no single approach that’s going to work for all audiences, all topics and all platforms. That’s why you need some sort of flexible approach, in the sense that you can adapt your strategies to different events that come up and different types of misinformation. What we spend a lot of time talking about in my class on misinformation is that it’s really important to understand why people believe certain types of misinformation, and why people might have different reasons for believing the same misperception. So any corrective strategy needs to be tailored in some way. I talked about maybe removing some content, particularly content that is harmful, or down-voting or down-ranking content to prevent the spread of certain misinformation. But when it comes to people who hold these misperceptions and who are seeking out this information, those require different corrective strategies. There are some strategies around suggested content, where you algorithmically put correct information into people’s feeds without them noticing, and there are more explicit corrective interventions that, again, need to be tailored to why people are believing the misinformation, and to which audiences are believing it or seeking it out for different reasons. Basically, a multi-pronged approach.
McInerney (23:45)
Going back, you had mentioned that preventing new exposure to misinformation is one thing, and correcting misperceptions is a whole other thing. Just a little bit more on the idea of correcting myths and misperceptions: How can individuals help with that?
Chinn (24:13)
Yeah, I think the default mode we want to go into, when we are talking with somebody who believes some sort of misinformation, is to go in hard and try to correct it as strongly as possible and explain to them all of the reasons that they are wrong. And that is rarely effective, because people become very defensive, and nobody likes to be told that they’re wrong, let alone have to publicly admit that they’re wrong. So a lot of more effective corrective strategies take a bit more of a patient approach: building trust with the people around you, finding sources that they find credible that correct their misperceptions and, again, really understanding why people believe this misinformation. Maybe it’s due to a particular negative experience that made them particularly fearful, so maybe it has to do with addressing that root cause. Maybe a piece of information is challenging their strongly held identities, values or belief system, and you need to find a way to fit the accurate information into their existing worldviews, values or belief systems, and explain to them why it is not a threat to their identities or their values.
Chinn (25:20)
So you need to appeal to these different reasons why people might believe the misperception in the first place, and understand that it’s a long game. This isn’t something you’re going to correct in one sitting. The best approach I can offer, when interacting with people in person or online, is to approach it with an attitude of curiosity, rather than trying to immediately make it into an argument. Because you can go a long way in figuring out how people got there by asking them, “Why did you come to believe this? Where did you see this information? That’s interesting, I’ve heard something different. Can you explain a bit more?” Getting them to explain it is sometimes a tactic for revealing to themselves where some faults in logic might be, but it also gives you a better understanding of how they got to this position in the first place. An attitude of curiosity, patience, and focusing on the root causes of why people believe the misinformation.
McInerney (26:35)
That’s all for today. To learn more about misinformation, check out April’s feature in the Herald, “Fact or Fiction: Worsening misinformation spread poses dangerous implications for society.”
McInerney (26:40)
This episode of the Badger Herald Podcast was hosted by me, Scott McInerney, and produced and edited by Lydia Larsen. Thanks for joining us!