How exactly do Facebook's moderators decide what is and is not appropriate to keep on the site these days? The Guardian knows and they just made it all public.
The 100 internal training manuals, spreadsheets and flowcharts examined as part of a Guardian investigation into Facebook's moderation policies reveal the company's rule book for what stays on the social media site - and what goes. We covered a similar story in 2016, when a German publication leaked an old set of Facebook guideline slides for third-party moderators, in which we discovered that photos of celebrities peeing are fair FB game.
Basically, anything that gets flagged on Facebook goes to one of approximately 7,500 Facebook moderators, who then consults this massive manual of what's allowed. And what's allowed is pretty weird and complicated. The Guardian's investigation is thorough, so it's easy to go down the rabbit hole of categories within categories of rules, but here are a few notable ones:
Moderators must delete comments like "Someone shoot Trump," because as a head of state he is in a protected category. However, posts like "to snap a bitch's neck, make sure to apply all your pressure to the middle of her throat" or "kick a person with red hair" are no problem for FB.
Anyone with more than 100,000 followers is considered a public figure, which means they don't get the full protections given to private individuals. (See also: celebrity pee.)
Some photos of non-sexual physical abuse and bullying of children are allowed, as long as they are shared without "sadism and celebration." Child sexual abuse is never allowed.
Generally, photos of animal abuse are okay. Says Facebook, "We allow people to share images of animal abuse to raise awareness and condemn the abuse but remove content that celebrates cruelty against animals." (I think this means that if PETA wants to show you animal abuse, it's okay, but if a sicko animal abuser posts a video, it's forbidden.)
Videos of abortions are allowed as long as no one is naked.
Live-streamed attempts at self-harm are allowed because Facebook "doesn't want to censor or punish people in distress" and because "removing self-harm content from the site may hinder users' ability to get real-world help from their real-life communities."
Videos of violent deaths, while marked as disturbing, don't always have to be deleted because "they can help create awareness of issues such as mental illness." Videos celebrating death, for example "enjoying the justice" of public executions, are not allowed.
This is a baffling report. I keep re-reading bits to check that I've misunderstood but NOPE my comprehension's sound https://t.co/iKTIngmtUz
— Smashleigh Bunnikins (@Glitter_brawl) May 22, 2017
Naturally, there are strong arguments on both sides of Facebook's content moderation rules.
Eve Critchley, head of digital at the British mental health charity Mind, told TechCrunch that she thinks Facebook needs to do more when it comes to content depicting self-harm and suicide. "We don't yet know the long-term implications of sharing such material on social media platforms for the public and particularly for vulnerable people who may be struggling with their own mental health. What we do know is that there is lots of evidence showing that graphic depictions of such behaviour in the media can be very harmful to viewers and potentially lead to imitative behaviour. As such we feel that social media should not provide a platform to broadcast content of people hurting themselves."
TechCrunch's Natasha Lomas notes, "It would be relatively easy for Facebook to ban all imagery showing animal cruelty, for example, but such a position is apparently 'too safe' for Facebook. Or rather too limiting of its ambition to be the global platform for sharing. And every video of a kicked dog is, after all, a piece of content for Facebook to monetize."
Facebook stands by its rules even after two highly publicized incidents of live-streamed murder and suicide. One killing took place in April in Cleveland, and another involved a father killing his child in Thailand; both were broadcast on FB. In the latter case, it took 24 hours for Facebook to remove the video.
"We work with publishers and other experts to help us understand what are those moments (that warrant publication.) For example, on 11 September 2001, bystanders shared videos of the people who jumped from the twin towers. Had those been livestreamed on Facebook that might have been a moment in which we would not have removed the content both during and after the broadcast," explained Monika Bickert, Facebook’s head of global policy management to the Guardian. "We more recently decided to allow a video depicting an Egyptian man who set himself on fire to protest rising government prices."
Bickert also released a statement saying, "Keeping people on Facebook safe is the most important thing we do."
Phew.
Related: Leaked Facebook Moderation Doc Says Photo Of Fergie Peeing Is OK, Mocking The French Not OK