
Researchers are tracking another epidemic, too—of misinformation

Emma Spiro (left) and Kate Starbird (right)

(left to right): Doug Parry/Information School/University of Washington; Center for an Informed Public

By Greg Miller

When five researchers at the University of Washington, Seattle, launched the new Center for an Informed Public back in December 2019, they had no idea what was coming. The center aims to study how misinformation propagates and use the findings to “promote an informed society, and strengthen democratic discourse.” Now, just a few months later, the coronavirus pandemic is generating a tidal wave of information—some of it accurate, some not so much—that has saturated social and traditional media.

Two of the center’s founders—sociologist Emma Spiro and crisis informatics researcher Kate Starbird—are watching closely. By monitoring news reports and scraping massive amounts of data from social media platforms, they are examining how misinformation is spreading during the pandemic, and how scientific expertise factors into public perceptions.

“We’re trying to think about questions of how data and statistics are being used and debated in these conversations online, and what is the impact of that on public understanding and the way that people make decisions and take actions,” Spiro told ScienceInsider this week. 

Spiro and Starbird discussed lessons learned from the spread of misinformation in past crises, and some of the things they’re hoping to learn from this one. This interview has been edited for brevity and clarity.

Q: Why is misinformation so pervasive during a crisis?

Kate Starbird: Historically, a lot of misinformation is a byproduct of the natural response that people have to a disaster event. There’s a lot of uncertainty about the impacts of the event and what actions we can take to respond to it. That uncertainty contributes to anxiety, and in those conditions of high uncertainty and anxiety, people come together to try to make sense of what’s going on, to participate in what we call collective sensemaking. Rumoring is a part of that, as people try to find the best information. Sometimes rumors turn out to be false, but rumors can also turn out to be true.

Emma Spiro: It’s important to emphasize here that the process Kate was describing does help alleviate some of the anxiety people feel, because people can take actions and make decisions that are based on some communal group level understanding of what is currently happening. The other thing I would add is that sometimes the cost of not passing along information, even though you’re not sure whether it’s true or false, can be really, really high. You know, if there’s a flash flood warning and some town has to [decide whether to] evacuate, you want to err on the side of caution.

K.S.: For a lot of people, participation in the process is altruistic. We’re seeing that happen a lot in this current crisis as people pass along information because they think they can help their friends and family.

Q: We’re also seeing misinformation, or at least mixed messages, from the highest levels of government. What is the impact of that?

K.S.: When we see political actors contradicting the messages coming from organizations that are science-based, it contributes to a building of distrust. It makes me really nervous when elected officials or political appointees are contradicting their experts.

E.S.: That’s something that carries forward. This is not the last crisis we will experience, and that undermining of scientific experts and the recommendations of our response agencies has long-term detrimental effects that I think we really don’t understand yet.

Q: What kinds of research questions are you trying to address during this crisis?

E.S.: One of the key questions on our minds right now is how scientific expertise on virus transmission, risk, and the like intersects with a social media environment where people can very easily gain influence, and what the impact of that sort of attention seeking is on the conversation. … If we find cases where it’s been problematic for certain individuals to gain so much attention, then we will think about ways to potentially mitigate that, or give recommendations to some of our response agencies about ways they can promote more accurate information.

K.S.: We’re always interested in the dynamics of information flow, how information moves from social media to more traditional media and back again. And we’re trying to identify recommendations, whether it’s for tech companies or individuals or crisis responders.

Q: Kate, you recently wrote a Medium post that included recommendations for individuals. What are some of them?

K.S.: We should tune into how our anxiety is perhaps causing us to have unhealthy behaviors, whether it’s just spending too much time in these environments or spreading rumors. When we see something, we can reflect on where it came from and do a little research to figure out if it’s true or not.

Q: What do you see as tech companies’ responsibility here?

K.S.: Twitter has made some moves to try to get rid of certain kinds of misinformation and to surface the voices of the right experts, which I think are good moves. But I want to be careful about punishing people for sharing rumors or misinformation. I don’t think the platforms should do that. We see that in authoritarian states. It’s so important that people feel that they can share information, and sometimes they’re going to get it wrong. It’s a real balance there, and there are some really hard trade-offs.

Source: Science Mag