Twitter tightens regulations on coronavirus misinformation

Platform will require users to remove tweets including unverified claims

As coronavirus misinformation continues to spread across social media, Twitter has announced it’s expanded the types of tweets it’ll require people to remove. Now, Twitter will require users to delete content that includes “Unverified claims that incite people to action, could lead to the destruction or damage of critical infrastructure, or could lead to widespread panic, social unrest, or large-scale disorder.”

Twitter gave an example of a misinformed post it would require the user to remove: “The National Guard just announced that no more shipments of food will be arriving for two months — run to the grocery store ASAP and buy everything!”

Unfortunately, unverified claims have already led to the destruction of 5G towers in Britain. After conspiracy theories falsely linked the spread of COVID-19 to the 5G rollout, believers set fire to the towers. The theory held that the coronavirus spread through Wuhan because of the city's recent 5G rollout, and that the virus's spread could be traced to other cities using 5G.

According to an April 22 tweet, Twitter has already removed upward of 2,230 misleading tweets since introducing policies related to COVID-19 content on March 18, 2020. Early policies stated that Twitter would require users to remove tweets including content that could increase the chance of someone contracting or transmitting the novel coronavirus. 
Twitter isn’t the only company taking aim at coronavirus misinformation either. 

Facebook has also announced changes to its platform, all of which are geared toward curbing misinformation related to the coronavirus. Not only is the company working with over 60 fact-checking organizations, but it also recently began releasing funds from a $1 million grant program in partnership with the International Fact-Checking Network. 

Beyond its fact-checking efforts, Facebook has also begun notifying users via News Feed if they have liked, reacted to, or commented on COVID-19 misinformation that it has since removed from the platform.

