
Meeting COVID-19 Misinformation and Disinformation Head-On

The U.S. needs a national strategy to combat health-related misinformation and disinformation.


A Q&A WITH TARA KIRK SELL 

Long before SARS-CoV-2 was identified, misinformation and disinformation about a plethora of health topics were already running rampant online.

There’s a distinct difference between the two: Misinformation is a broader classification of false or inaccurate claims shared largely unwittingly and without the intention to deceive. Disinformation is a specific subset of misinformation created with deliberate intentions to deceive. Both have caused significant and real harm throughout the COVID-19 pandemic.

While Facebook, Google, and Twitter are stepping up accountability measures on their platforms, truly addressing the problem requires larger solutions and a focused national approach.

The Center for Health Security’s new report, National Priorities to Combat Misinformation and Disinformation for COVID-19 and Future Public Health Threats: A Call for a National Strategy, offers a comprehensive plan for a national approach to stamping out mis- and disinformation. Tara Kirk Sell, a senior scholar at the Center and lead author of the report, answers some questions about both the problem and the proposed solutions.

Q&A

What are some common threads of mis- and disinformation in the context of COVID-19?

During health emergencies we usually see four types of false information: mischaracterization of the disease or of the protective measures that are needed; false treatments or medical interventions; scapegoating of groups of people; and conspiracy theories—often about the existence or origin of the pathogen, profiteering, or politics. We did an analysis of misinformation during the Ebola crisis in 2014 and are seeing many of the same themes in this outbreak today.

The report's proposed strategy rests on four pillars:

  • Intervene against false and damaging content as well as the sources propagating it
  • Promote and ensure the abundant presence and dissemination of factual information
  • Increase the public’s resilience to misinformation and disinformation
  • Coordinate a national strategy that includes input from social and news media, government, national security officials, public health officials, scientists, and the public

What would it look like to treat this as an issue of national security?

It means elevating the issue from a matter of public health risk communication to one that acknowledges that the spread of false health-related information, especially during health emergencies, has implications for national security.

The National Security Council would be responsible for developing and overseeing the U.S. strategy for preventing and managing health-related misinformation and disinformation in public health emergencies, drawing on the expertise, implementation capabilities, and existing efforts of the departments of Health and Human Services, Defense, and Homeland Security; the State Department; and the intelligence community.

How can platforms like Facebook and Twitter be held accountable?

It’s clear that we need active, transparent, nonpartisan intervention from online social media and news sources.

Many efforts have been made over the course of the pandemic to take action against false information and the people who spread it, but the problem is immense and continues to get worse. I don’t think that it makes sense to leave sole responsibility for decisions about intervention to technology companies during public health emergencies. Implementation guidance should instead come from a national commission that offers neutral, evidence-based recommendations to improve the health communication landscape. This commission, made up of social media operators, communication specialists, public health experts, and bioethicists, could help formulate governance options, mitigation strategies, and interventions against misleading content that could undermine the public’s health.

What does it mean to prioritize risk communication for public health departments?

I have always felt that risk communication is disproportionately neglected when it comes to preparing for public health emergencies. You can be the most prepared or have a great vaccine, but none of that works if people aren’t receptive. Yet many health departments are short-staffed on risk communicators, and these individuals play many roles. They need more support.

I think we also need to see more outreach through social media platforms and news media, and greatly increase the flow of true information through many channels rather than just the public health echo chamber.

How can individuals learn how to identify mis- or disinformation online?

This is a great question. While systemic changes are needed, they should also include ways to help the public become more resilient to false information—so that when people see misinformation online, little of it would penetrate, like water off a duck’s back.

To do this, we should promote health and digital literacy through multiple sources including schools, community organizations, social media, and news media. We should also provide consumers with tools to choose responsible sources of information and increase their awareness of disinformation tactics and approaches.

Here are some basic rules when it comes to identifying and dealing with false information.

If you see false information online

  • Don’t repeat or retweet the lie, even with a correction!
  • If you don’t know the source or can’t tell whether it is legitimate, limit direct engagement.
  • Report it to social media companies.
  • Provide true information.

If you need to respond to people who believe false information

  • Engage respectfully.
  • Connect along common values.
  • Talk about tactics and how misinformation draws you in.
  • Discuss alternative explanations.
  • Encourage verification.
  • Provide alternative sources.

Ways to check for false information

  • Use web-based tools and services that can provide unbiased assessment of source credibility.
  • Verify the information with other news sources or trusted people in your network, or by cross-referencing it against the best information available.
  • Ensure that the source is known, credible, and trusted by taking a close look at the social media account, web URL, or page layout for signs of a lack of editorial oversight.
  • Think twice about messages that seem designed to appeal to emotions.
  • Increase awareness of disinformation campaign tactics and personal biases that influence judgment of sources and information, as well as one’s capacity to change opinion when presented with new evidence.

What would it look like for such a diverse group of stakeholders to come together for a national strategy like the one proposed in the report?

I’d like to see collective planning for a strategy from social media, news media, government, national security officials, constitutional scholars, public health officials, scientists, and the public. If we can’t bring all the stakeholders together and move forward in a coordinated manner, this just becomes a whack-a-mole situation.

Have there been similar large-scale, cross-sector efforts to address health issues in the past?

This is a bit similar to the national biodefense strategy, although it really requires strong input from the private sector. In today’s world, we have big problems that cross many different domains, and I think it’s a good example of the direction that we need to head for solutions.


Tara Kirk Sell, PhD, is a senior scholar at the Johns Hopkins Center for Health Security. She has worked for over a decade on pandemic preparedness and response with a focus on risk communication and management of health misinformation.
