
Facebook moderation policy on sex, terrorism and violence, leaked

A Guardian investigation has revealed for the first time the secret rules and guidelines Facebook uses to decide what users can post on the social media site. Guardian journalists obtained more than 100 internal training manuals, spreadsheets and flowcharts that show how Facebook moderates content involving violence, hate speech, terrorism, pornography and racism.

The documents also reveal that Facebook allows users to livestream videos of self-harm. There are also guidelines covering match-fixing and cannibalism.

Here are some of the Facebook rules and guidelines revealed by the Guardian:


“We do not action photos of child abuse. We mark as disturbing videos of child abuse. We remove imagery of child abuse if shared with sadism and celebration.”

“Facebook does not automatically delete evidence of non-sexual child abuse to allow the material to be shared so the child [can] be identified and rescued, but we add protections to shield the audience”.

“We allow photos and videos documenting animal abuse for awareness, but may add viewer protections to some content that is perceived as extremely disturbing by the audience. Generally, imagery of animal abuse can be shared on the site. Some extremely disturbing imagery may be marked as disturbing.”

“We allow people to share images of animal abuse to raise awareness and condemn the abuse but remove content that celebrates cruelty against animals.”

“Videos of violent deaths are disturbing but can help create awareness. For videos, we think minors need protection and adults need a choice. We mark as ‘disturbing’ videos of the violent deaths of humans.”

Joanna Lewis
