Facebook is looking to regain the public’s trust with an explanation of what is accepted on the site, what is considered inflammatory, and why certain posts end up being deleted. The company has published its Community Standards before, but these expanded, internal procedural details come in the wake of the Cambridge Analytica scandal.
In a lengthy post, Facebook clarified how it would be updating the existing guidelines, which have for the most part been a grey area. The post detailed the company’s struggles with “identifying potential violations” of its content guidelines, which have resulted in some hate speech and degrading images being left public while other content of the same nature was taken down.
Facebook stated that two mechanisms are responsible for catching hateful messages: artificial intelligence and users flagging content they find inappropriate.
The issue with these methods was the lack of a clear understanding of what is acceptable and when to flag a post. With so many false flags, Facebook is introducing an appeals process to appease its critics.
The appeals process, which will become available later in 2018, applies to posts removed for nudity, hate speech, or extremely graphic content. If your post is flagged, you will have a window of time to appeal the decision.
Within 24 hours of your appeal, your content will be manually reviewed by a Facebook employee. The social media giant also noted that content creators and users can provide additional context for a flagged item, in the hope of giving the employee handling the appeal more information to make an accurate decision.
Last month, Facebook CEO Mark Zuckerberg said: “We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.”
With these updated guidelines, it’s safe to say he is one step closer to fixing issues that have long plagued the company. What are your thoughts on the upcoming changes?