Facebook on Tuesday released a new set of rules to control the type of posts users are allowed to publish on the platform.

The move gives far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.

Facebook for years has had “community standards” for what people can post.

But only a relatively brief and general version was publicly available, while it had a far more detailed internal document to decide when individual posts or accounts should be removed.

Now, the company is providing the longer document on its website to clear up the confusion.

The company wants to be more open about its operations, said Monika Bickert, Facebook’s vice-president of product policy and counter-terrorism.

“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Bickert told reporters in a briefing at Facebook’s headquarters.

Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech.

The company has also been accused of doing the bidding of repressive regimes by aggressively removing content that governments object to.

It is also said to have provided too little information on why certain posts and accounts are removed.

The new policies will, for the first time, allow people to appeal a decision to take down an individual piece of content.

Previously, only the removal of accounts, groups and pages could be appealed.

Facebook is also beginning to provide specific reasons for taking down content in a wider variety of situations.



Copyright 2020 TheCable. All rights reserved. This material, and other digital content on this website, may not be reproduced, published, broadcast, rewritten or redistributed in whole or in part without prior express written permission from TheCable.