Publishing Our Internal Enforcement Guidelines and Expanding Our Appeals Process

Angelica Greene
April 25, 2018

Facebook is going one step further and publishing the internal guidelines it uses to enforce its Community Standards.

The social network is also adding the ability for users to appeal decisions they believe were made in error.

In May, Facebook will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the USA and other countries, where the company will gather feedback directly from users.

Monika Bickert, Facebook's VP of Global Policy Management, and Brian Fishman, its Global Head of Counter-Terrorism Policy, have also detailed the measures the platform takes to keep terrorist content off the service. The company is seeking feedback from Facebook users on the policies as well.

"You should, when you come to Facebook, understand where we draw these lines and what is okay and what's not okay", Facebook's vice president of product policy and counter-terrorism Monika Bickert, a former United States federal prosecutor, told reporters on Tuesday.

For the first time, Facebook said it would introduce a mechanism that allows users to appeal decisions to take down content. Bickert says Facebook plans to start with an appeals process for posts that were removed for nudity or sexual activity, hate speech, or graphic violence, with appeals reviewed by a "community operations" team within 24 hours. "For example, while we restrict some images of female breasts that include the nipple, we allow other images, including those depicting acts of protest, women actively engaged in breast-feeding, and photos of post-mastectomy scarring", Facebook says.

Facebook now employs more than 7,500 content reviewers, up more than 40 percent from the same time a year ago. Platforms have faced mounting criticism over graphic content; YouTube star Logan Paul, for example, was recently harshly criticized for a video showing a person who had died by suicide in Japan.

Facebook relies on artificial intelligence technology and reports from users to identify posts, photos, and other content that may violate its standards.

Quartz noted that some of the new rules were "clearly developed in response to a backlash Facebook received in the past". One such episode involved the black activist Ijeoma Oluo: while on a road trip, she had stopped at a Cracker Barrel, a chain that has paid millions to settle lawsuits over racial discrimination, and posted, "At Cracker Barrel 4 the 1st time". Facebook also removed some of the accounts that attacked Oluo.

How will Facebook's appeals process work? If a post was removed by mistake, the user will be notified and the content will be restored.

The guidelines are global and will be released in 40 different languages. As a result, Facebook acknowledges, its Community Standards can sometimes appear less nuanced than it would like, leading to outcomes at odds with their underlying goal.

Facebook is a bit vague on the details, and both critics and supporters had called for greater transparency. The restrictions apply to images of real people as well as digitally created content, although art, such as drawings, paintings or sculptures, is an exception. Facebook also said that more than half of the videos it removed for violent extremism had "fewer than 10 views". If its new efforts to combat actions and subject matter deemed harmful succeed, Facebook will probably continue to receive flak for doing too much; after all, the company has been accused of various forms of censorship in the past.

"We make mistakes because our processes involve people, and people are not infallible", Bickert said. That means Facebook deletes calls for violence or slurs that may be protected free speech in the USA under the First Amendment.

What's the context for Facebook's move? Executives say it's a hard balancing act to weigh what is acceptable expression and what is not.

Bickert acknowledged that Facebook's reviewers sometimes make the wrong decision.

During his testimony on Cambridge Analytica, Zuckerberg was pressed by senators on how Facebook defines hate speech. According to training slides on hate speech, for example, the statements "You are such a Jew" or "Migrants are so filthy" are allowed, but writing "Irish are the best, but really French sucks" is not. The number of content reviewers has increased more than 40 percent over the past year. "I just wish that there were a way... a process where we could more accurately reflect the values of the community in different places", he said.

"Determining if something is hate speech is very linguistically nuanced", he said. But he acknowledged there are challenges.
