Facebook has updated its community standards page to provide the public with more comprehensive information on what posts, images and additional content it permits on its site. This includes guidelines on why it may take down or restrict posts that feature sexual violence and exploitation, hate speech, criminal activity, self-injury or bullying.
These restrictions extend to digitally-created content, unless specifically posted for educational or satirical purposes. Similarly, text-based descriptions of such acts that contain vivid detail are prohibited by the site. The firm’s reorganized community standards now include a separate section on dangerous organizations, which explains why the site bans all content and activity that supports them.
The new guide, which is nearly three times the length of the previous one, has been drafted in an effort to offer clarity to users who complain about others’ posts. The California-based company said that the regulation of content on the site is consistent with how it has applied its standards in the past, and that the policies themselves remain unchanged.
What has changed, however, is the guidance that helps users understand its community standards. The rewritten guide is intended to address confusion about why some content takedown requests from users are rejected.
Facebook finds it challenging to apply one set of standards and policies across its entire online community, 80 percent of which lives outside the US and Canada. The firm said it understands that people from different societies may have different views on what is appropriate to share on the web.
However, posts that may seem inappropriate to a particular group of users may not necessarily be in violation of the site’s community standards. As a result, the social network restricts content in countries where it violates local laws, even if that content doesn’t violate its standards.