In a recent publication, University of Wisconsin Life Sciences Communication assistant professor Sedona Chinn and University of Michigan Communication and Media assistant professor Ariel Hasell examine the effects of “doing your own research” (DYOR) on beliefs about COVID-19.
The study surveyed participants about their beliefs about COVID-19 and their trust in scientific institutions in relation to doing their own research. Because the data came from a survey, the conclusions drawn are associations, not causal relationships, Chinn said.
Though the research centered on DYOR in the context of COVID-19, Chinn said she and Hasell had been seeing people use similar language around other health and wellness topics.
“What struck us was that it [language being used] wasn’t necessarily ‘experts are bad and evil [and] you shouldn’t listen to them,’ but it was more reflecting these ideas that your own experiences and intuition and your own research are just as valid as listening to scientific evidence or the advice of experts,” Chinn said. “And we thought that was an interesting idea because there’s a lot of tensions around it.”
The tensions cut both ways, Chinn said. On one hand, DYOR can make people more engaged, for example by learning who and what will be on a ballot. On the other hand, it may also promote skepticism of things like vaccines.
Once the pandemic hit, Chinn said, DYOR gained momentum as people were stuck in their homes with uncertainty and fear and turned to the internet for information and communication. Conspiracy movements like QAnon, and the journalistic coverage that followed, also drew more public attention to DYOR, Chinn said.
The study found that DYOR perceptions accounted for only about 1% of the variance in COVID-19 misperceptions and trust in science, meaning the effects of these perceptions are small but may accumulate over time, according to the publication.
Between December 2020 and March 2021, the three-month period over which Chinn and Hasell tracked COVID-19 beliefs, they were surprised to see a change in trust in science, Chinn said. The surprise stemmed from the expectation that trust in science wouldn’t change much over just three months.
Measuring trust in science once in December and again in March revealed that trust in December predicted trust in March, Chinn said. In other words, people’s previous beliefs are the best predictors of their future beliefs. But even after controlling for this, Chinn and Hasell found a change.
“Strong support of doing your own research was [associated] with people becoming a little less trusting of science over time, whether that was because of the research they were doing or for another reason — we’re not really sure,” Chinn said. “Three months is a relatively short period of time, especially considering the duration of the pandemic, and so this could add up and become cumulative over time.”
Part of the responsibility lies with journalists, and especially with sensational or overdramatized headlines, Chinn said. These can lead readers to incomplete or inaccurate conclusions before they read an article that may not itself be misleading.
In other words, while the article may say one thing, the headline may say another, leaving people confused or misled, Chinn said. This can also produce news fatigue, steering people toward DYOR.
“[W]hile it might be effective at grabbing people’s short-term attention on social media, I think people get news fatigue, they get burnt out, and they might say this kind of reporting isn’t doing it for me, I’m going to go do my own research to try and figure out more accurate information or less sensationalized information or information that’s coming from a source that I trust more,” Chinn said.
Some types of misinformation people encounter may also be rooted in more settled science than others, recent UW Life Sciences Communication Ph.D. graduate Nicole Krause said. In other words, not all misinformation is created equal.
Experts sometimes try to reduce the effects misinformation has on those who encounter it, but even these efforts can cause collateral damage or lessen trust in science and scientists, according to a publication by Krause and colleagues.
According to the publication, corrective interventions may succeed in reducing false beliefs about a topic, but they might also have unintended consequences, or collateral damage, for other beliefs and attitudes.
“What we found was that attempts to correct the false belief that mRNA vaccines contain live virus did succeed [in their corrective effort], but that specific style of corrective from the CDC and the World Health Organization made some of the participants in our study think that live virus vaccines were risky, so that’s an unfortunate consequence of that corrective,” Krause said.
Depending on the claim being corrected, others may also perceive scientists who fact-check or correct misinformation as overstepping their boundaries or limiting free speech, Krause said.
This isn’t to say all corrective strategies will be unsuccessful. Rather, some strategies work very well for certain subgroups of people, Krause said, and there is no single broad public that would be “uniformly affected” by a given strategy.
“Some corrective strategies that work very well and don’t damage trust in science among some subgroups might [damage trust in science] among others, and it ends up being a risk-benefit calculation that you do with deciding whether to implement a corrective strategy or not. I think it depends on how they’re delivered,” Krause said.
But fact-checking corrections carry ethical implications, Krause said. While tech companies like Google and Facebook are incorporating fact-checks and correctives into their search results, the ethical question remains of who gets to define what’s false and limit discourse.