Facebook has unveiled its once-secret rules and guidelines for policing content. Criticism that the policies, called community standards, were unclear and inconsistently enforced compelled Facebook to open up about how these decisions get made.
“We think the new community standards are going to give people the knowledge they need to say: ‘We think you applied the policy incorrectly,'” said Monika Bickert, vice president of global product management. “This is going to be a way to give people a real voice in this process.”
Users will also be able to ask Facebook to take a second look at content removed for hate speech, violence or other violations. If your photo, video or post is removed for violating Facebook’s rules, you will be given the option to “Request Review.”
The rules are interpreted differently from country to country depending on the language, culture and mores of Facebook users. Appeals will be conducted by a “community operations” team within 24 hours. If Facebook determines it made a mistake removing content, it will be restored.
The moves announced Tuesday are the latest in a series of efforts to restore public trust after 87 million people had their data taken without their consent by Cambridge Analytica, a British political firm with ties to Donald Trump’s presidential campaign.
Facebook has developed dozens of rules to draw the line between what should and shouldn’t be allowed on the platform, with the aim of making Facebook a safer and less toxic place for its 2.2 billion users. The company also plans to use AI tools to identify certain kinds of harmful activity.