For the first time ever, Facebook has published its internal moderation guidelines. The company has been in the hot seat for the past year, and when CEO Mark Zuckerberg testified before the US Senate earlier this month, many of the questions concerned content removal policies. Releasing the internal guidelines its moderators use is another step toward greater transparency.
If you’re not in the mood to read the 27-page document, here’s what you need to know from it.
Facebook disclosed that it relies on a combination of human and AI moderators to review and remove harmful content. The company currently employs more than 7,500 human moderators, 40% more than it had a year ago, working in 40 languages around the world.
The company has come under fire for applying less strict moderation standards in non-English-speaking parts of the world; it was criticized, for instance, for its poor policing of content related to the ethnic cleansing of the Muslim Rohingya in Myanmar. Working in more languages means Facebook can review content more effectively in more areas of the world. This is a big issue, especially since in many parts of the world Facebook is essentially the Internet, much in the same way that WeChat dominates Internet usage in China.
The document finally offers users the company's definitions of things like threats of violence, hate speech, sexual exploitation, and other types of harmful content. It also provides background information to explain the rationale behind the policies.
Monika Bickert, Vice President of Facebook’s Global Policy Management, said, “One of the questions we’re asked most often is how we decide what’s allowed on Facebook. These decisions are among the most important we make because they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view.”
Facebook stated that it released the guidelines for two reasons. The first is to help people understand where the line is on certain issues. The second is that Facebook wants feedback on the guidelines from users and experts in different industries so they can be improved.
The Review Process
In addition to releasing its internal moderation guidelines, the company also announced it would be updating its review process. In the coming year, users will be able to appeal decisions made on posts that were removed. An appeal will trigger a review by a human moderator on Facebook’s community team. If the moderator determines a mistake was made, Facebook will notify the user and restore the post. The review process will also be extended to enable appeals for content that was reported but left up.
Bickert said that users will “have the option to say, ‘Look at this again’. We’re hoping that the new community standards will provide people with the tools they need to make that decision to appeal.”
In an effort to solicit more direct feedback, Facebook also announced it will host a series of public summits this summer. These summits, called “Facebook Forums: Community Standards,” will be held in Germany, France, the UK, India, Singapore, the US, and other countries.
Facebook stated it will update the moderation guidelines it gives to reviewers every week or two moving forward. These updates will reflect feedback given by users and industry experts, and the changes will also appear in the public version of the guidelines.
According to Bickert, executives at Facebook have wanted to share the internal guidelines for a while. The delay came from organizing the guidelines and translating them into language users could clearly understand.
Although Facebook is moving towards more transparency, there is still a lack of trust in the tech giant. While users haven’t fled as many predicted, there is still a certain air of misgiving and suspicion. Having the guidelines clearly stated is a start, but now that the company has laid out its rules plainly, it has the responsibility of following through with actions.