Facebook updates community rules, expands appeals process

Call center employees work for Facebook's Community Operations Team in Essen, Germany, on November 23. Workers review Facebook content and delete it if it does not meet company standards.

Martin Meissner / AP



Facebook announced changes to its content review policy on Tuesday, adding an appeals process for deleted content and publishing the internal guidelines on which it depends to make content determinations.

While the social media giant has made a set of community standards publicly available for several years, the latest update includes the more detailed guidelines its content reviewers use internally when deciding whether to allow or remove posts.

The updated appeals process will allow people whose photos, videos or other posts are taken down to contest decisions they believe were made in error. Previously, appeals of community standards decisions were allowed only when an entire Facebook page, group or profile was removed.

Facebook has hesitated to reveal details of its content review policy in the past. But the company says Tuesday's announcement is part of its pledge to "do better" and be more transparent about how it decides what stays up and what comes down. The changes come weeks after CEO Mark Zuckerberg was questioned on Capitol Hill about alleged censorship of conservative views on Facebook.

Several lawmakers asked Zuckerberg during his marathon testimony about "Diamond and Silk," two pro-Trump commentators who claim that Facebook intentionally limited their presence on the site because of their political views. Zuckerberg apologized and called the situation an "enforcement error" by Facebook, but the controversy raised questions about what kind of content Facebook restricts and how it makes those decisions. Diamond and Silk are set to testify before Congress this week.

Granular Standards

The newly released standards are a marked departure from Facebook's previous guidelines, which were written to convey the company's values and priorities without overwhelming readers.

"We've always had a set of community standards that the public can see," Facebook vice president Monika Bickert told NPR's Steve Inskeep, "but now we're actually explaining how we defined those terms for our review teams and how we enforce those policies. "

Those new explanations are nothing if not thorough. They detail dozens of reasons a post can be removed, and they read more like the product of a team of lawyers than the words of an upstart technology company. The standards describe methods for categorizing content and provide specific definitions for terms such as "hate speech," "terrorism" and "threats of violence."

"People define those things in different ways," Bickert said, "and people who are using Facebook want to know how we define it and I think it's fair."

Some objectionable content is sorted into tiers, with Facebook's response scaled to the severity of the offense. Other content is removed if it meets multiple conditions in a checklist-like system. A threat of violence, for example, may be deemed "credible" and removed if it identifies a target and includes "two or more of the following: location, timing, method."
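For illustration, the checklist-style rule above might be expressed roughly like the sketch below. The logic and field names (such as names_target and mentions_location) are hypothetical stand-ins for this article, not Facebook's actual review tooling.

```python
# A minimal, hypothetical sketch of the checklist-style rule described above:
# a threat is treated as "credible" if it identifies a target and includes
# two or more of location, timing, and method. Field names are illustrative only.

def is_credible_threat(post: dict) -> bool:
    """Return True if the post meets the credibility checklist."""
    if not post.get("names_target"):
        return False  # no identified target -> not treated as credible
    detail_signals = ("mentions_location", "mentions_timing", "mentions_method")
    details_present = sum(1 for signal in detail_signals if post.get(signal))
    return details_present >= 2

# Example: a target plus a location and a method crosses the two-detail threshold.
example_post = {
    "names_target": True,
    "mentions_location": True,
    "mentions_timing": False,
    "mentions_method": True,
}
print(is_credible_threat(example_post))  # True -> the post would be removed
```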

Other standards target particular categories of offensive posts. The rules specifically mention content that "promotes, encourages, coordinates or provides instructions" for eating disorders or self-harm. And in its "harassment" section, Facebook says it will not tolerate claims that survivors of traumatic events are "paid or used to deceive people about their role in the event." Other rules prohibit marketing drugs, revealing the identities of undercover law enforcement officers, and depicting graphic violence.

Context Matters

However, when it comes to judging content, context is crucial, and Facebook has been criticized in the past for its clumsy approach to community moderation. In 2016, for example, the company reversed its decision to remove a post containing the Pulitzer Prize-winning "napalm girl" photograph, which showed a naked, burned child during the Vietnam War.

Bickert says that example shows that exceptions are needed for culturally significant and worthwhile content.

Facebook's updated rules now list some exceptions for depictions of adult nudity, including "acts of protest," "breastfeeding" and "post-mastectomy scars."

Still, questions remain about Facebook's content moderation program. Despite Zuckerberg's stated desire to use artificial intelligence to flag offensive content, the process remains a deeply human one. According to Bickert, the company has more than 7,500 moderators stationed around the world, working 24 hours a day, seven days a week.

But conversations with those moderators paint a far less flattering picture of Facebook's processes than Bickert does. In 2016, NPR's Aarti Shahani detailed a workforce composed largely of subcontractors stationed in far-flung countries and asked to review huge volumes of posts each shift.

It is not difficult to imagine how someone located thousands of miles away, raised in a different culture and under immense pressure to review as many posts as possible, could get it wrong.

The appeal of appeals

Facebook seeks to address that problem with its new appeals system. Now, if your post is removed for "nudity, sexual activity, hate speech or violence," you will be given the opportunity to request a review.

Facebook promises that appeals will be reviewed within 24 hours by its Community Operations team, but it remains unclear how that team relates to Facebook's front-line reviewers. If appeals are judged under the same conditions as the initial content decisions, the process may amount to little more than an empty gesture.
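As a rough illustration of the process described above, a sketch like the following captures the two concrete details Facebook has published: which removal reasons can be appealed, and the promised 24-hour review window. The type and function names are hypothetical, not Facebook's systems.

```python
# A hypothetical sketch of the appeal flow described above: only certain
# removal reasons can be appealed, and Facebook promises a decision from its
# Community Operations team within 24 hours. Names here are illustrative only.

from dataclasses import dataclass
from datetime import datetime, timedelta

APPEALABLE_REASONS = {"nudity", "sexual activity", "hate speech", "violence"}
REVIEW_WINDOW = timedelta(hours=24)  # promised turnaround for an appeal

@dataclass
class Removal:
    reason: str
    removed_at: datetime

def can_appeal(removal: Removal) -> bool:
    """A taken-down post can be appealed if its removal reason is appealable."""
    return removal.reason in APPEALABLE_REASONS

def review_due_by(removal: Removal) -> datetime:
    """Latest time by which the appeal review is promised to be complete."""
    return removal.removed_at + REVIEW_WINDOW

removal = Removal(reason="hate speech", removed_at=datetime(2018, 4, 24, 9, 0))
print(can_appeal(removal))      # True
print(review_due_by(removal))   # 2018-04-25 09:00:00
```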

Facebook notes that the content review and appeals process is just one way users can clean up their experience on the site. Users can also block, unfollow or hide posts and posters they do not want to see.

For the social media giant, it's about maintaining balance. Balance between freedom of expression and user safety. Balance between curbing "false news" and encouraging open political discourse. And balance between Facebook's obligation to steward a welcoming environment and the realities of running a publicly traded, for-profit corporation.

"We try to allow as much speech as possible," Bickert said, "and we sometimes know that it can make people feel uncomfortable."

Facebook says Tuesday's announcements are just one step in an ongoing process of improving and adjusting its standards and policies. How much improvement this step represents remains to be seen.

