Facebook Files lift the lid on social media giant’s censorship rules

British newspaper the Guardian has got its hands on a tranche of leaked files from Facebook which lift the lid on how the social media behemoth decides what content it can and cannot censor.

The leak amounted to more than 100 of Facebook's training manuals and other internal documents which guide staff on how to deal with content such as terrorism, pornography, revenge porn, hate speech, racism, self-harm, and violence.

The Facebook Files, as the Guardian has termed its coverage of the documents, come at a time when Facebook has been under pressure in the media for its failure to block content alleged to include videos of rape and suicide, often involving minors. Along with other social media sites, it has also come under pressure from governments in the UK and elsewhere to clamp down harder on terrorist-related content.

Huge time pressures on moderators

One of the crucial revelations from the files is the simple fact that Facebook’s moderators, the people in charge of applying the myriad of rules laid out in these documents, are totally overrun and working in conditions where mistakes are inevitable.

According to the Guardian, they often have as little as 10 seconds to decide whether or not to censor a post. Mistakes can mean that disturbing content which should be taken down is left up, such as the recent case of the suicide of a 12-year-old girl, which was live-streamed and remained on Facebook for more than two weeks.

But there is another side to this issue: under such pressured timeframes, there are also inevitably going to be posts and pages which Facebook censors when it most likely shouldn't.

Confusion and inconsistency

The leak has also revealed the level of inconsistency in Facebook's censorship rules. Many free speech advocates have been horrified at some of the rules, which they say make Facebook in effect the world's largest online censor.

Some content that many people would like to see removed, such as some violent deaths, some non-sexual child abuse, and some animal abuse content, does not have to be taken down because, according to Facebook, such posts 'may help raise awareness of the issues they contain'.

The files also reveal that once a user passes 100,000 followers they are considered a public figure and are therefore not subject to as many protections as private users.

In defending Facebook's position on censoring the content on its site, Facebook's head of global policy management, Monika Bickert, told the Guardian: "We have a really diverse global community and people are going to have very different ideas about what is okay to share. No matter where you draw the line there are always going to be some grey areas."

Should Facebook be society’s moral arbiter?

She is undoubtedly right in the points she makes, but the release of these rules has opened up a wider debate about freedom of speech and whether Facebook should be able to censor content under its own rules.

To give Facebook some credit, it does appear to be trying to strike a balance between removing content that most people would agree has no place in society and keeping up content that is important for educational or cultural purposes but might nonetheless offend.

But it is an inexact science, with a myriad of competing pressures over what should and should not be permissible. Add in the 10-second timeframe in which moderators have to make a decision, and what you end up with is an impossible situation.

This leaves the question of whether it is even Facebook's responsibility to censor content on its site at all, and indeed whether, as a society, we are happy to have Facebook make such deep moral and philosophical decisions on our behalf.

Somewhere a decision has to be made about where individual responsibility ends and corporate responsibility begins. And until that problem is solved satisfactorily, Facebook will continue to be the arbiter of what is socially acceptable and what should and shouldn't be censored.
