Facebook makes its community guidelines public and introduces an appeals process
Last May, The Guardian published a leaked copy of Facebook’s content moderation guidelines, which describe the company’s policies for determining whether posts should be removed from the service. Almost a year later, Facebook is making an expanded set of those guidelines available to the public, a move designed to gather input from users around the world. The company is also introducing a new appeals process, allowing users to request a review if they believe their post has been removed unfairly.
The community standards run 27 pages and cover bullying, violent threats, self-harm, and nudity, among many other topics. “These are issues in the real world,” said Monika Bickert, head of global policy management at Facebook, in an interview with reporters. “The community we have using Facebook and other large social media mirrors the community we have in the real world. So we’re realistic about that. The vast majority of people who come to Facebook come for very good reasons. But we know there will always be people who will try to post abusive content or engage in abusive behavior. This is our way of saying these things are not tolerated. Report them to us, and we’ll remove them.”
The guidelines apply in every country where Facebook operates, and have been translated into more than 40 languages. The company says it developed them in consultation with a “couple hundred” experts and advocacy groups from around the world. As the guidelines evolve — and they will evolve, Bickert said — they will be updated simultaneously in every language.
The guidelines also largely apply to Facebook’s other services, including Instagram, although there are some differences. (You don’t have to use your real name on Instagram, for one.) The underlying policies haven’t changed, Bickert said, though they now include extra guidance on making decisions. “What’s changing is the level of explanation about how we apply those policies,” Bickert said.
Amid a series of unfolding humanitarian crises, Facebook has been under pressure to improve content moderation around the globe. In March, the United Nations blamed Facebook for spreading hatred of the Rohingya minority. Facebook was also forced to temporarily shut down its services in Sri Lanka last month after inflammatory messages posted to the service incited mob violence against the country’s Muslim minority. This weekend, a report in The New York Times connected hate speech on Facebook to murders in Indonesia, India, and Mexico.
In response, the company has said it will double its 10,000-person safety and security team by the end of this year. It also plans to update the guidelines regularly as new threats emerge. Facebook is making the guidelines public now because it hopes to learn from users’ feedback, Bickert said.
“I think we’re going to learn from that feedback,” she said. “This is not a self-congratulatory exercise. This is an exercise of saying, here’s where we draw the lines, and we understand that people in the world may see these issues differently. And we want to hear about that, so we can build that into our process.”
Facebook also announced plans to develop a more robust process for appealing takedowns made in error. The company has faced regular criticism over the years for high-profile removals, whether of a photo of a woman breastfeeding her child or an iconic wartime photograph.
Now users will be able to request that the company review takedowns of content they posted personally. If your post is taken down, you’ll be notified on Facebook with an option to “request review.” Facebook will review your request within 24 hours, it says, and if it decides it has made a mistake, it will restore the post and notify you. By the end of this year, if you have reported a post but been told it does not violate the community standards, you’ll be able to request a review for that as well.