Lightning will never strike the same place twice. Humans only use 10% of their brains. So-called “right-brained” people are creative and intuitive, while “left-brained” people are quantitative, analytical thinkers.
While many may believe these statements, none of them are true.
In fact, the Empire State Building is struck by lightning on average 25 times a year, humans use far more than 10% of their brains and differing personality traits cannot be chalked up to the dominance of one half of the brain.
At face value, these misconceptions may seem harmless. But they belong under a much more ominous umbrella — misinformation.
Misinformation is no new phenomenon. People have been spreading falsehoods and deceptions for as long as the tools have been available to them.
Whether through fearmongering, propaganda or simple rumors, misinformation has left its dark mark throughout history. It helped catalyze the Enlightenment, sparked the Spanish-American War and formed the core of Nazi antisemitic propaganda.
Today, misinformation persists in newer, prettier and more pervasive packaging, deceiving millions and fostering misperceptions potent enough to steer society toward a more dangerous mode of information consumption than ever before.
Social media’s contribution
University of Wisconsin junior Gage Kent gets most of his news through social media, forgoing traditional outlets like TV and newspapers. He’s drawn to the convenience of scrolling through Twitter and clicking on articles to get the information he needs.
Still, he remains cautious of what he reads on the internet.
“Misinformation is everywhere,” Kent said. “I take things at face value, but I do that with a grain of salt.”
Kent is not alone.
Roughly 1 in 5 Americans primarily get their news from social media and 51% of Americans say they sometimes encounter made-up news. A 2018 study from MIT found falsehoods are 70% more likely to be retweeted on Twitter than accurate information.
Sedona Chinn, an assistant professor in Life Sciences Communication, studies the social dynamics of scientific uncertainty in the media and the natural tension between evolving scientific knowledge and people’s desire for concrete information. Across her research, she’s found an overwhelming trend: the rise of social media as a dominant communication avenue may aid the spread of misinformation.
Content on social media is largely user-created and shared through personal connections, Chinn said, allowing posts to spread information that would otherwise be filtered out by traditional media organizations. The ability to like, comment and share stories creates a whole new path for information, and misinformation, to circulate, whether a story comes from close friends and family or from influencers and celebrities.
“On the one hand, social media can be very empowering, particularly for groups who’ve been marginalized, whose stories have been often suppressed,” Chinn said. “And on the other hand, you’re having a proliferation of people sharing ideas that are not accurate and/or hateful.”
Algorithms, which select the content on users’ feeds, tend to prioritize outrage and high engagement, as well as content users are more likely to engage with based on their interests, UW Life Sciences Communication professor Dietram Scheufele explained.
This leaves people vulnerable to tailored news that exploits their ideology to further perpetuate inaccuracies.
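To see why that matters, consider a toy sketch of such a ranking function. This is not any platform’s actual algorithm, which is proprietary and far more complex; every field name and weight below is invented for illustration. The point is structural: nothing in the score rewards accuracy.

```python
# Hypothetical toy model of an engagement-driven feed ranker.
# All field names and weights are invented for illustration.

def feed_score(post: dict, user: dict) -> float:
    """Score a post higher when it is engaging and matches user interests."""
    engagement = post["likes"] + 2 * post["comments"] + 3 * post["shares"]
    outrage_boost = 1.5 if post["high_arousal"] else 1.0  # outrage travels further
    interest_match = len(post["topics"] & user["interests"])
    # Note what is missing: no term rewards accuracy or penalizes falsehood.
    return engagement * outrage_boost * (1 + interest_match)

posts = [
    {"likes": 80, "comments": 4, "shares": 2, "high_arousal": False, "topics": {"science"}},
    {"likes": 40, "comments": 30, "shares": 25, "high_arousal": True, "topics": {"politics"}},
]
user = {"interests": {"politics"}}

# The inflammatory post that matches the user's interests outranks the
# calmer one, with no regard for which is true.
for post in sorted(posts, key=lambda p: feed_score(p, user), reverse=True):
    print(post["topics"], feed_score(post, user))
```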
Even those cognizant of their information streams can be susceptible to motivated reasoning, the natural human tendency to favor information that agrees with one’s existing beliefs and biases. This leaves just about anyone vulnerable to believing misinformation, meaning the vigilance of Kent and others may still not be enough against the overwhelming flood of information on social media and the internet today.
“We’re not living in an environment anymore where science journalists curate news for us, at least not for the majority of us,” Scheufele said.
Social media may have changed the information landscape by blurring the line between truth and fiction, but spreaders of misinformation often capitalize on the nature of uncertainty itself.
Chinn said uncertainty is baked into the scientific process. In politicized settings where people don’t know what to believe, misinformation is often reframed or taken out of context, exploiting that uncertainty in an oversaturated media landscape to serve ulterior motives.
Many fear what they don’t know, so uncertainty in news media can drive emotional engagement. In studying how emotions motivate the spread of misinformation, Chinn said the posts that get the most engagement are highly emotional. Emotions can drive people to share stories indiscriminately, Scheufele said. Early in the pandemic, for example, fear may have led users to share news on the off chance it was true.
“[Emotional content] is often misinformation as well because strong emotions, whether positive or negative, can disrupt people’s critical thinking skills,” Chinn said.
While the tools of social media and the nature of human knowledge make a potent recipe for misconceptions, the problem is larger than not knowing certain facts.
‘Lives at Risk’
A falsehood spread on an individual level can seem relatively inconsequential. Yet when inaccuracies become widely perpetuated, they can have broader impacts on the fabric of society, disrupting public discourse and eroding faith in established institutions and scientific evidence.
The societal implications of misinformation can be seen in scientific issues like the COVID-19 pandemic. Amid the mysteries in the early days of the pandemic, the World Health Organization declared the overflow of COVID-19 information had caused an infodemic, a state of receiving too much information, including false or misleading information, that causes confusion and risk-taking behaviors.
Researchers would later estimate that COVID-19-related misinformation in the first three months of the pandemic may have led to nearly 6,000 hospitalizations and around 800 deaths.
“During COVID our problem was, yes, to develop vaccines and therapies, but we actually did pretty well on that,” Scheufele said. “The second part of the problem though, is none of these vaccines and therapies are useful if people either don’t see the problem or are not willing to take them.”
Director of the Center for Journalism Ethics Kathleen Culver said researchers learned a lot from the pandemic about how misinformation can spread. Whether it be innocent exchanges of false information or targeted disinformation efforts, she said there are a host of consequences.
“I think it’s critical because again, as we see during the pandemic, you actually have lives at risk because of misinformation,” Culver said.
Beyond challenging science, misinformation poses a threat to democracy. Citizens aware of fake news become less concerned with what to believe and more with whom to believe, which is just as much a problem as the content itself.
As major concerns over fake news become more salient, the issue goes beyond not knowing the facts. Just the idea of fake news existing sows mistrust in the media, Chinn said, leading to broader concerns about reaching a social consensus on trustworthy sources of information.
Without agreement on what is and isn’t true, a democracy can’t even begin to have productive discussions, Chinn said.
Media mistrust that disrupts political discourse can have tangible effects on democracy. According to UW political behavior and mass media expert Michael Wagner, the repercussions of the constant onslaught of information are evident in recent elections in Wisconsin.
Despite strong evidence that the 2020 election was accurate and fair, continued voter fraud investigations and assertions attempt to cast doubt on the result.
Long after the vote was counted, a Republican-backed investigation in Wisconsin — a key swing state — looked for a basis to decertify the results. Yet, nonpartisan lawmakers and experts in the state have repeatedly stated that decertification would be impossible.
A fundamental requirement of democracy is agreeing on the rules of an election, but Wagner said Wisconsin is struggling to clear that bar. Nearly two years after the election, efforts continue to undermine the public’s faith in the presidential result.
“It’s something that I think a lot of people who care about the sustainability of democracy are really worried about right now,” Wagner said.
While election misinformation hits close to home in Wisconsin, it can also reach an international stage.
School of Journalism and Mass Communication professor Young Mie Kim studies targeted disinformation campaigns and their effects in elections. She found evidence of Russian actors using targeted ads to interfere with the 2016 election.
Russian actors would masquerade as American political interest groups, Kim said, sharing Facebook ads designed to spark incivility around controversial topics like LGBTQ+ rights, immigration policy or conspiracy theories about Democrats.
Kim found a significant effect of the disinformation — targeted voter suppression.
The ads designed for voter suppression employed a variety of tactics, including encouraging a boycott of the election, emphasizing the barriers to voting or promoting a third-party candidate. Kim said the voter suppression effect was small across the study as a whole, but the pattern grew stronger among the targeted non-white demographics.
“If you think about vote margin in battleground states, in 2016, [the impact] is not so negligible,” Kim said.
While disinformation campaigns attempting to influence American elections pose a threat to democracy, the tactics these actors employ are only possible through social media. False information infiltrates people’s feeds, making the platforms unreliable sources of truth, with consequences for individuals and society alike.
A healthier media landscape
More Americans view fake news as a major issue facing the nation than terrorism, illegal immigration, racism or sexism, according to a study from the Pew Research Center, and 56% of those surveyed believe the problem will only get worse in the next five years.
Given that misinformation has persisted throughout history, Wagner said he’s unsure the problem will ever be completely solved.
“I don’t think it’s possible to change the hearts of the people who are willing to share disinformation,” Wagner said.
Still, some social science research looks to alleviate the problem by better understanding the flow of information. Scheufele said collaboration among policymakers, social media platforms and academia is necessary to tackle the problem.
Culver said academic institutions like UW should provide students with media literacy resources, teaching them how to manage information streams and judge what to believe in the media. Public opinion leaders, such as the chancellor, should also speak openly on important matters.
“I think institutions like [UW] have an obligation to the truth in the same way journalism has an obligation to the truth,” Culver said.
As for the government, Kim said regulation should focus on making social media platforms a healthy environment, which could be accomplished by increasing transparency about data usage and advertisements on social media.
In the push for social media transparency, advertising law now requires advertisers to have a substantial basis for the claims they make and to provide clear disclosure if an ad’s content could be misleading.
But social media platforms must also shoulder some of the burden of moderating content. Chinn said tech companies should at least be flagging misinformation on their platforms, but because doing so takes substantial resources, companies drag their feet in building the monitoring infrastructure.
From a legal standpoint, platforms have historically denied liability for misinformation because they do not create the content presented on their platforms. But Chinn said increased efforts to pursue defective product lawsuits could pave a path forward to accountability.
“If a toaster explodes, you can sue the toaster company,” Chinn said.
For example, a 2021 lawsuit argued Snapchat and two other messaging apps were “inherently dangerous” because the platforms did not honor their claims to enforce safeguards against cyberbullying. The lawsuit, filed after a teen who was cyberbullied on the apps took his own life, ensured the “toaster companies” in this case were held accountable for the explosion.
Culver said individual users should also be held accountable for certain behaviors and should lose their account if their behaviors violate platforms’ terms of service.
“If you told me five years ago that we would be in a place where people could be spreading misinformation that could harm people’s very lives, and they could do that without any consequence, I would have told you you’re nuts,” Culver said.
Users have another role to play in keeping themselves — and others — well informed.
Tackling misinformation on a personal level may seem insignificant, but Chinn said recent research is focusing on user correction, in which users comment with corrective feedback to weaken misleading posts.
Improving society’s media habits remains an uphill battle for researchers and individuals alike. People like Kent, who get most of their news through social media, may not entirely realize the impact of their personal habits, making the path to eradicating misinformation’s influence a long one.
“I don’t share news stuff a ton, but when I do I usually just see a crazy headline and I go ‘woah that’s crazy.’ And I send it to my brother,” Kent said. “That’s something I’m trying to be better about.”