All you need to know about Facebook’s content removal policies

By Elton Gomes

After Facebook CEO Mark Zuckerberg testified before Congress, the social media company released its policies on content moderation and removal.

In an attempt to be more transparent about how it operates, Facebook has for the first time shared with the general public how content is handled and moderated on the website.

“Our Community Standards, which we will continue to develop over time, serve as a guide for how to communicate on Facebook. It is in this spirit that we ask members of the Facebook community to follow these guidelines,” the company said in a statement.

Facebook revealed that to moderate content on its website, it relies on a combination of more than 7,500 human moderators, working in 40 languages around the world, and artificial intelligence (AI) tools to remove malicious content.

The company’s policies give users a glimpse of how it defines violent threats, hate speech, sexual exploitation, and other harmful content. They also explain the rationale behind those definitions.

Screen grab of the Facebook post detailing the policy changes.

What content is permissible and objectionable, according to Facebook

Facebook’s policies are meant to help people understand where the company draws the line on certain grey-area issues, according to Monika Bickert, vice president of global policy management. The company will also give people the right to appeal its decisions.

As per Facebook’s policies, fully nude close-ups of buttocks are not allowed, but they will be permitted if they are photoshopped onto a public figure. Content from hacked sources will not be accepted, but an exception will be made “in limited cases of newsworthiness”.

In terms of policing hate speech, Facebook will no longer exclude minorities from protection against hate speech. Furthermore, Facebook will now notify users when their content is removed for nudity, sexual activity, hate speech, or graphic violence, and will allow them to hit a “Request Review” button, with reviews expected to happen within 24 hours.

To give users a better sense of how the company works, Facebook will now be holding Facebook Forums: Community Standards events in Germany, France, the UK, India, Singapore, and the US. In terms of curbing graphic violence, images of cannibalism and “visible internal organs” will be flagged as objectionable content.

Facebook profiles of mass murderers will be taken down if “they’ve killed four people at once, as defined by whether they were convicted or identified by law enforcement with images from the crime, or whether they took their own life or were killed at the scene or aftermath.”

Nude pictures of children will be removed by the social media website, even in those instances where the child’s parents post the pictures: “We know that sometimes people share nude images of their own children with good intentions; however, we generally remove these images because of the potential for abuse by others and to help avoid the possibility of other people reusing or misappropriating the images.”

According to Facebook’s guidelines, if an individual states that a victim of a tragedy is lying, or is being paid to lie, this behaviour will be considered a form of harassment.

Among a host of other policies, Facebook will not tolerate anyone being insensitive towards a person’s vulnerabilities; it will not allow anyone to buy or sell marijuana or pharmaceutical drugs on the website; and it will not allow calls for violence after an election.

Why is this important

A report in TechCrunch raises the question of whether Facebook’s guidelines amount to an editorial policy. If they do, shouldn’t Facebook be classified as a media company that exercises editorial control over its content?

Facebook’s new policies might have helped it regain some trust; what remains to be seen is how effective the policies really are in curbing violence and hate speech.
