Study calls for Facebook to stop outsourcing content moderation

Study also urges Facebook to double the number of moderators and hire a content overseer

The NYU Stern Center for Business and Human Rights has released a report that urges Facebook to stop outsourcing content moderation.

According to the report, Facebook’s decision to outsource content moderation is the key reason the company’s efforts to moderate the platform are failing.

The center is therefore calling on Facebook to end the practice and commit to bringing the work in-house.

According to the report, Facebook users and the company’s artificial intelligence systems flag more than three million items for moderation daily. With Facebook reporting a 10% error rate across its 20 moderation sites, that means the company makes an estimated 300,000 content moderation mistakes per day.
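
For illustration only, the 300,000 figure follows from simple arithmetic on the numbers above. The short Python sketch below just restates the article’s figures; the variable names are ours, not the report’s:

items_flagged_per_day = 3_000_000  # items flagged daily by users and Facebook's AI
error_rate = 0.10                  # Facebook's self-reported moderation error rate

estimated_mistakes_per_day = items_flagged_per_day * error_rate
print(f"{estimated_mistakes_per_day:,.0f} moderation mistakes per day")  # 300,000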

Facebook’s content moderation problems have been widely reported, but Paul Barrett, the study’s principal author, wanted to highlight that although content moderation is fundamental to Facebook, the company’s choice to outsource the work to underpaid contractors in remote locations is problematic.

Barrett also cites the lack of training moderators receive in processing flagged content as a major issue.

“They never actually teach you to process what you’re seeing,” says Sean Burke, a participant in Barrett’s study. “It’s not normal seeing people getting their heads cut off or children being raped.”

To remedy Facebook’s content-moderation problem, Barrett has called on the social media company to bring its moderation efforts in-house while also providing moderators with proper salaries and benefits.

Other recommendations include doubling the number of moderators, hiring a content overseer, expanding moderation in at-risk countries, providing on-site medical services to moderators and expanding its fact-checking efforts.

In an interview with VentureBeat, Barrett recognized that the cost of implementing the suggested measures will likely serve as a major deterrent for the company, though he’s optimistic Facebook will consider some of the recommendations.

“It is a very ambitious ask,” Barrett said. “But my attitude is if the current arrangement is inadequate, why not just go for it and urge [the company] to remedy the problem in a big way. I don’t think Mark Zuckerberg is going to [smack himself on the head] and say, ‘Oh my god, I never thought of that!’ But I do think it’s possible the company is ready to move in that direction.”
