People ‘feel safe’ using violent language on Facebook

The company admits that a lack of empathy on social media leads to abusive language

Facebook has admitted that people feel at ease using violent language on its platform, after new documents revealed the categories of content that Facebook does and does not find acceptable.

An investigation by The Guardian has exposed the company's standards for removing offensive material, including the fact that images of animal and child abuse do not have to be deleted unless the context is overtly sadistic.


The report has raised questions about how people behave on the platform, and what level of responsibility - if any - Facebook has to police the content they post.

According to internal training documents seen by The Guardian, Facebook's users feel safe using violent, threatening language to express their frustrations online. "People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways," the documents state. "They feel that the issue won't come back to them and they feel indifferent towards the person they are making the threats about because of the lack of empathy created by communication via devices as opposed to face to face."

According to Facebook, violent imagery and threats do not become serious enough to remove until they transition from "an expression of emotion" to "a plot or design". For example, a statement like 'someone shoot Trump' would be eligible for deletion, as threats against heads of state fall into a protected category; 'let's beat up fat kids', treated as a non-credible expression of frustration, would not.


Online abuse and harassment have become a persistent problem over the past few years, particularly on social media platforms like Facebook and Twitter, both of which have faced calls to crack down on hate speech and bullying. Twitter has responded with measures such as improved reporting tools and the retirement of the default 'egg' display picture.

"We feel responsible to our community to keep them safe and we feel very accountable. It's absolutely our responsibility to keep on top of it," Facebook's head of global policy management Monika Bickert told The Guardian. "It's a company commitment."
