Facebook’s new ‘community standards’ have leaked, and not everyone agrees with them

In case you were wondering whether sharing a piece of Facebook content might affect its ongoing re-platforming policy: that is just one entry in the range of information that made its way into a copy of a new Facebook “community standards” document published Wednesday by a German activist. The document surfaced online as the company undergoes a public relations crisis over how it treats different kinds of content, and in a public survey conducted just last month, most respondents agreed with policies that seemed out of step with social media’s embrace of collaboration and cross-pollination.

According to various news outlets, the new document is one of several that now make up the company’s Community Standards Policy, which the company uses to communicate its various policies to users and to the third-party networks they use. Not everybody who submitted opinions to the survey received a printout of it, however, and when Deutsche Welle, a German news network, asked for a scan of the survey document, the company conceded that the email administrators had allegedly sent it had been leaked.

In an investigation into the leak, journalists at the German news network were able to obtain some of the documentation. On Thursday, the company posted a summary of the document along with a statement from founder and CEO Mark Zuckerberg, who asked that he be thanked for his initiative and invited German President Frank-Walter Steinmeier to comment on the incident as he would “any other concerned citizen.” In its statement, Facebook said:

We are a public company, and we make this information available to our community whenever it’s requested. We believe the information to be legitimate, public and reflective of the community standards set by Facebook around the world.

The document is said to contain data going back to 2013. It is not immediately clear what Facebook found worth sifting through in the survey, but what is clear is that there have been multiple instances where members of the community disapproved of the company’s behavior and its responses when presented with information.

The document makes it clear where these controversial decisions came from, with screenshots showing that “main content” is prioritized over “particular content” in the community standards. For example, under the section on graphic violence, it says: “The fact that what is depicted was digitally created makes it more difficult to assess whether what is depicted is intentionally harmful or harmful to others. Therefore, we do not allow graphic violence on Facebook.” However, it then instructs moderators, for content that is graphic in nature, to adopt the following types of viewing decisions: “Avoid in certain circumstances (e.g. graphic images of dead bodies, open wounds, wounds that show the wound’s complexity, simple pictures of guts)” and “Accept, for example, graphic violence that is relatively uncontroversial or is an isolated case where users are posting contextually relevant content about the trauma of an individual, the violent events that took place, or the relationship between people who are looking to turn their lives around.” This appears to suggest that the company caters to users in certain parts of the world differently in certain cases, depending on where they are.

Facebook, of course, has endured a summer of crises so far in 2018, most notably an ongoing public firestorm over the harvesting of personal data from tens of millions of users and its management’s somewhat questionable response to the issue. Moreover, while the company continues to monetize the trust it has built with its 1.23 billion monthly users and their interests, the one piece of positive news for Facebook is that the continued public pressure has so far failed to fundamentally sway the outcome of its negative publicity. If anything, the episode has helped consumers weigh the costs of participating in the site, and a company summit this week brought further pressure to bear on the company. “[Facebook] has failed to be a government watchdog, and the company has bled data,” The Washington Post quotes Amnesty International’s New York director as saying at the meeting. “Trust and freedom of expression will not return until this is resolved.”

Read the full story at Deutsche Welle.
