
Surprising Takeaways From Facebook's Community Standards

Today's Facebook news is not that the social network is changing its community standards, the set of policies that govern what users can and can't post. It's that — for the first time — Facebook is releasing a much more detailed version of its standards to the public.
These standards are provided to the company's 7,500 community operations employees, who are in charge of reviewing content that is reported or flagged as a potential violation. Facebook's moderators decide whether to leave those posts up or take them down based on the standards. The policies address everything from posts with instructions for making explosives (which is a violation, "unless there is clear context that the content is for a non-violent purpose") to posts showing uncovered female nipples (also a violation "except in the context of breastfeeding, birth giving, and after-birth moments, health, or an act of protest").
If you read The Guardian's exposé on Facebook's leaked content moderation policies last year, some of the information made public today will look familiar. However, this is the first time Facebook is releasing its standards in full, along with a promise to continue updating them publicly as changes are made internally. Additionally, Facebook announced a new appeals process that will allow you to ask for a second look at any post removed for violating the company's nudity, hate speech, or graphic violence standards.
Monika Bickert, Facebook's VP of Product Management, says the company has been working to make its standards more transparent since 2017. Still, the timing of today's release coincides with Facebook's recent efforts to regain user trust post-Cambridge Analytica.
Ahead, some of the most interesting and bizarre takeaways from the policies that govern what Facebook does and doesn't allow.
There Are Some Strange Allowances For Content Featuring Public Figures
Some of the most important policies, including ones relating to bullying, do not apply to public figures. Facebook cites open conversation as a reason, saying, "We want to allow discourse, which often includes critical discussion of people who are featured in the news or who have a large public audience."
Public figures come up again in the section on adult nudity. Posting "images of real nude adults, where nudity is defined as visible anus and/or fully nude close-ups of buttocks" is a violation, "unless photoshopped on a public figure."
There Are Mentions Of Fake News
Surprisingly, Facebook does not remove fake news. The company again references freedom of speech as the reason, saying, "We want to help people stay informed without stifling productive public discourse. There is also a fine line between false news and satire or opinion."
In an attempt to curb fake news, Facebook will instead "significantly reduce its distribution by showing it lower in News Feed."
What does this mean for election interference? There is currently only one specific mention of elections in the standards: You cannot post "any content containing statements of intent, calls for action, or advocating for violence due to the outcome of an election."
Posts About Crisis Actors Are Forbidden
One inclusion under the "Harassment" section feels especially relevant post-Parkland: You cannot post things that "target victims or survivors of violent tragedies by name or by image, with claims that a person is lying about being a victim of an event; acting/pretending to be a victim of an event; or otherwise paid or employed to mislead people about their role in the event."
A Lot Of Disturbing Content Remains, So Long As It Has A Warning Screen
Under the "Graphic Violence" section of the policies, Facebook explains it does not allow "imagery of violence committed against real people or animals with comments or captions that contain enjoyment of suffering, enjoyment of humiliation, erotic response to suffering, remarks that speak positively of the violence, or remarks indicating the poster is sharing footage for sensational viewing pleasure."
However, if those comments or captions are not included, there is a lot of imagery — everything from photos showing dismembered animals to videos that show child abuse — that is not removed. That content is covered with a warning screen and limited to adults:
"We allow graphic content (with some limitations) to help people raise awareness about issues. We know people value the ability to discuss important issues like human rights abuses or acts of terrorism. We also know that people have different sensitivities with regard to graphic and violent content."
You can read the full version of Facebook's updated Community Standards online.