The world's biggest social network, Facebook, has for the first time in its history published the internal policy rules that employees follow when deleting posts or blocking users.
The twenty-seven-page document is broken into six sections covering the topics Facebook staff face on a daily basis: Violence and Criminal Behavior, Safety, Objectionable Content, Integrity and Authenticity, Respecting Intellectual Property, and Content-Related Requests.
The company's VP of Global Product Management, Monika Bickert, said in a release that the company's staff, stationed in eleven offices worldwide, includes experts on terrorism, child safety, and hate speech, many of whom had relevant backgrounds in these subjects before joining Facebook.
“I worked on everything from child safety to counter terrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher.”
The reasons that may lead to the removal of a post on Facebook include, but are not limited to: credible threats, support for terrorism, admissions that a user or their associates have committed crimes, images of naked children or explicit depictions of sexual intercourse, calls for harassment, and the publication of others' personal information.
Content subject to removal from Facebook does not necessarily have to be illegal. Facebook says it will frown on "attempts by individuals, manufacturers and retailers to purchase, sell or trade non-medical drugs, pharmaceutical drugs and marijuana," regardless of whether the substance is legal in a user's jurisdiction.
Bickert said the new guidelines were developed with "input from our community" and the help of experts in areas as diverse as technology and public safety.