How does Facebook screen content? Its guidelines largely leave it up to the screener to decide. – PB/TK
Facebook’s Content Screening Guidelines Leaked – By David Z. Morris / May 21, 2017
A large trove of leaked internal Facebook documents pulls back the curtain on how the social network’s employees moderate postings on the site.
When it comes to violent images and videos, the moderation team can choose to either take no action, mark content as ‘disturbing,’ or remove it entirely. Broadly, Facebook’s approach to violent imagery is to allow it because of its importance for public awareness. That includes video of violent human deaths, which are flagged as ‘disturbing’ but not, as a rule, removed. The broad exception is any imagery of violence that is shared with “sadism and celebration,” which should be removed, according to the guidelines.
The documents, obtained by The Guardian, run to thousands of pages and include worked examples. Their publication follows a series of recent incidents, including live-streamed murders and deceptive news, that have sparked debate about exactly what content should be allowed on the platform.
The documents show a delicate balancing act behind Facebook’s efforts to manage a deluge of postings from around the globe. The policies include the company’s approach to videos and images of graphic violence, cruelty to animals, and non-sexual child abuse. They also outline how Facebook’s moderation team is expected to screen threats of violence.
Continue to the fortune.com article: http://fortune.com/2017/05/21/facebook-content-screening-guidelines-leaked/?xid=homepage