Misinformation in the Age of Social Media and COVID-19


Social media apps pulled up on a phone on Wednesday, March 14, 2018. (Photo by Curtis Lin)

By Abhilasha Khatri, Investigative Editor

 

“Fake news” is a term that gets thrown around a lot, often with a joking undertone. The concept itself is not new, since distorting the news to sell stories is as old as newspapers themselves, but social media has made it more relevant than ever because it can spread misinformation at a rate not previously possible. It gives everybody the power to publish, not just accredited news organizations, and new technology allows for more convincing deception, such as doctored video “proof” of an incident.

Facebook was recently exposed by a company whistleblower for prioritizing engagement over managing divisive and false content. The platform has also been linked to incidents such as the January 6 insurrection at the U.S. Capitol, the ethnic cleansing of the Rohingya people in Myanmar and mob lynchings in India.

Glen Feighery, an associate professor of communication at the University of Utah, said the problem with social media when it comes to news is that it is based on a business model that prioritizes engagement over promoting the truth.

“If you can imagine a flowchart, so if engagement is the business model … what engages people?” Feighery said. “And the answer to that is, engagement equals provocative content … it’s the emotional stuff, it’s the polarizing stuff, it’s the extremist stuff.”

Social media’s influence on how people consume news is relatively new, and it has raised new questions about how platforms should be regulated.

“Mark Zuckerberg himself has testified before Congress and I think he reiterated that he’s asking to be regulated … of course, that’s overdue, but then how on earth do you regulate a company that has 3.5 billion users across Facebook and Instagram and WhatsApp?” Feighery said.

Misinformation is relevant not only because of Facebook’s scandals, but also because of its effect on public perception of COVID-19. Throughout the pandemic, false claims about COVID-19 — from warnings of DNA alteration and microchip implants to bogus treatments such as drinking bleach or taking medications intended for livestock like ivermectin — have been repeatedly debunked by scientists.

Grace Haws, a first-year chemistry major at the U, said they have seen such myths believed and circulated by their own family members.

“My uncle got COVID, and I have [another] uncle who’s a doctor … and he was asking my uncle, legitimately like, ‘Hey, can you prescribe me some hydroxychloroquine?’” Haws said. “My grandparents always are like, did you know about this conspiracy or whatever? I just started to tune it out because I’m like, I know what you’re saying is wrong.”

As for where their family members get their news, Haws said both social media and mainstream outlets contribute to their belief in misinformation.

“My uncle definitely watches Joe Rogan on the regular … and I know that my grandparents watch like Fox News … they’re not going out of their way to find both sides of stories,” Haws said. “With my grandparents, it’s the mainstream media … my uncle, he’s a bit more tech-savvy, so he will go on social media and kind of like, find these offshoot articles.”

When recommending ways to stay protected from misinformation, Feighery suggested diversifying sources of news.

“It’s easy just to stare at your phone all day and it’s easy to let your social media be your news … I don’t care if it’s Fox News, or MSNBC or Al Jazeera or the New York Times, Wall Street Journal, Associated Press, Washington Post, anywhere, you know, you should be getting at least something that’s reputable news,” Feighery said.

Haws said that when it comes to their family members, they no longer make the effort to change their minds.

“It’s not worth it … I can give them all the facts, all the evidence or whatever,” they said. “And they’ll be like, well, you just don’t understand like, they just won’t … once they are in that mindset, they won’t believe me, no matter what I throw at them.”

Sara Yeo, an associate professor of communication at the U, has researched factors that influence public opinion on scientific issues. In a written statement, she said research has shown that simply stating the facts is not enough to change someone’s mind.

“We know this as the knowledge deficit model, which I have written about with colleagues,” Yeo said. “Knowledge does matter, of course, but it is not the only factor that informs people’s attitudes toward science. There is ample evidence that people’s values and other factors influence how they seek information about science … and their opinions toward scientific issues.”

Yeo said there are strategies that seem promising in reaching people.

“There is evidence to suggest that using narratives (i.e., storytelling) has some positives, and the PNAS paper to which I referred … offers some ideas around the use of humor as a way to ‘disarm’ people who might be guarded about the issue,” Yeo said.

According to Yeo, one challenge in combating misinformation is that putting the responsibility entirely on individual consumers is unrealistic.

“We have limited ability to recognize misinformation,” she said. “Moreover, we often lack motivation to parse and identify misinformation — research in human cognition offers ample evidence that the ways in which we seek and process information are not necessarily conducive to discerning misinformation. Many of these processes are not conscious processes.”

Despite this limitation, Yeo said promoting media literacy and pairing knowledge with emotion and humor are all potential tools for combating misinformation.

 

[email protected]

@a_khatri_news