Facebook reveals its internal rules for removing controversial posts

Posted at 6:35 AM, Apr 24, 2018

Facebook is trying to be more transparent about how it decides what content to take down or leave up.

On Tuesday, the company is making its detailed internal community standards public for the first time.

The document is what Facebook's 7,500 content moderators use when deciding what is and isn't acceptable content, including hate speech, nudity, gun sales and bullying. A shorter version was previously available online.

Facebook is also adding a way for individuals to appeal when it removes one of their posts because of sexual content, hate speech or violence. Appeals will be reviewed by a moderator within a day, the company promises. Eventually, it will add appeals for more types of content and for people who reported posts that weren't taken down.

Every week, Facebook sifts through millions of reports from users about inappropriate posts, groups or pages. Additional posts are also flagged by Facebook's automated systems. A member of the team of moderators — a combination of full-time and contract employees around the world — reviews each post.

The expanded guidelines fill 27 pages and include the reasoning behind each policy, along with detailed examples.

They include the company's full definitions of terrorist organizations and hate groups. Hate speech is divided into three levels and includes "some protections for immigration status." There's a detailed policy on the sale of marijuana (not allowed, even where it's legal) and of firearms (shown only to adults 21 or older, with no sales between private individuals). Bullying rules don't apply to comments made about public figures.

The document is filled with striking details about very specific issues. For example, you can't post addresses or images of safe houses, or explicitly expose undercover law enforcement. You can only show victims of cannibalism if there's a warning screen and age requirement. And photos of breasts are allowed if they depict an act of protest.

Facebook has come under criticism for not being transparent enough about how it decides what is or isn't banned, and it has at times appeared inconsistent in applying its own rules.

Most recently, Facebook fought accusations that it censored conservative personalities like Diamond and Silk in the United States. Human rights groups have complained about its handling of hate-filled posts linked to violence in countries like Myanmar.

"Our enforcement isn't perfect. We make mistakes because our processes involve people, and people are not infallible," Monika Bickert, Facebook's head of product policy, said in a blog post Tuesday.

The guidelines are global and will be released in 40 languages. Facebook says it has detailed local information to help moderators handle the nuances of different regions and languages. Some moderator guides, such as lists of hate-speech words, will stay private because releasing them could make it easier for people to game the system.

To keep up with changes in language and behaviors, the guidelines are updated regularly. A policy team meets every two weeks to review potential additions or edits.

"We've promised to do better and we hope that sharing these details will serve as the basis for increased dialogue and input," Bickert said.

The-CNN-Wire™ & © 2018 Cable News Network, Inc., a Time Warner Company. All rights reserved.