Facebook Clarifies Offensive Content Policies
- By Matt Holden
- Mar 16, 2015
Facebook has published a blog post outlining its philosophy regarding inappropriate and offensive content.
Facebook will still rely on users to report offensive or inappropriate posts, but users will now have clearer guidance on what the company considers a violation of its standards.
Any flagged content is reviewed by a moderator, whether it is reported by one person or 100, and is removed if it violates the standards or the law.
“In order to treat people fairly and respond to reports quickly, it is essential that we have policies in place that our global teams can apply uniformly and easily when reviewing content,” Facebook writes. “As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards.”
Attacking people based on gender, race, or other traits isn't allowed, but hate speech can be linked to or re-posted for the purpose of raising awareness.
Facebook also wants people to sign up using their real identities.
“When people stand behind their opinions and actions with their authentic name and reputation, our community is more accountable,” said the post.
About the Author
Matt Holden is an Associate Content Editor for 1105 Media, Inc. He received his MFA and BA in journalism from Ball State University in Muncie, Indiana. He currently writes and edits for Occupational Health & Safety magazine and Security Today.