Facebook’s internal rules on sex, terrorism and violence

Facebook’s secret rules and guidelines, by which moderators decide what the social network’s 2 billion users may or may not publish on the site, have come into the hands of Guardian reporters for the first time. Their investigation is likely to trigger a new wave of debate about the role and ethical principles of this social media giant.

The Guardian reporters obtained more than 100 internal training manuals, spreadsheets and flowcharts that offer insight into the rules by which Facebook moderates sensitive topics such as violence, hate speech, terrorism, pornography, racism and self-harm. There are even guidelines on match-fixing and cannibalism.

These “Facebook Files” let us read the instructions and rules formulated by the management of a site that is now under strong political pressure in Europe and the United States.

The documents reflect the difficulties faced by executives trying to respond adequately to new challenges, such as “revenge porn”, as well as the strain on the site’s moderators, who are so overworked that they often have “no more than 10 seconds” to make a decision.

“Facebook cannot keep control of its content,” noted one source. “It has grown too quickly and reached a giant size.” Reportedly, many moderators are unhappy with the inconsistency and strangeness of certain rules. The rules concerning sexual content are said to be especially complicated.

One of the documents says that Facebook moderators check for 6.5 million fake accounts, known as FNRPs (fake, not real person), every week.

Facebook’s guidelines for moderators, which run to several thousand slides and illustrations, may raise concerns among critics who believe that Facebook, as a major publisher, should do more to remove violent, harmful and dangerous content.

However, these instructions may also alarm defenders of freedom of speech, who are convinced that Facebook has in effect already become the world’s largest censor. Both sides are likely to demand greater transparency.

Guardian reporters reviewed the documents that were handed to Facebook moderators over the past year. According to these documents, the site’s moderators should:

— Delete remarks such as “somebody shoot Trump”, because as a head of state his name belongs to a protected category. Meanwhile, moderators may allow statements such as “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” or “fuck off and die”, because such statements are not considered credible threats.

— Videos of violent deaths are marked as disturbing, but not all of them have to be removed, because they can raise awareness of important problems such as mental illness.

— Some photos of non-sexual physical violence and child abuse do not have to be removed, provided they contain no elements of sadism or glorification of violence.

— Photos of animal cruelty can be shared, with a special “disturbing” mark applied to the most shocking of them.

— “Handmade” images of nudity and sexual activity are permitted, while digitally created images of sexual activity are not.

— Videos of abortions are acceptable, as long as they contain no nudity.

— Facebook will also allow people to livestream scenes in which they harm themselves, because it “doesn’t want to censor or punish people in distress”.

— An account owner with more than 100,000 followers is considered a “public figure” and is denied the protection given to private individuals.

Other types of statements that moderators may allow, according to these documents, include remarks such as “a little girl needs to keep to herself before daddy breaks her face” or “I hope someone kills you”. Such threats are regarded as too generic or impersonal to be considered credible.

In one of these documents Facebook acknowledges that “people use violent language and swear words to express discontent and irritation online” and that they “feel safe” doing so.

The document states: “They feel that what they say will never come back to them, and they feel indifferent towards the person they are threatening, because of the lack of empathy created by communicating through electronic devices rather than face to face.”

“It is worth noting that violent language is usually not a credible threat, unless the specificity of the language gives us strong reason to believe that it is no longer simply an expression of emotion but has become a concrete plot or plan. From this point of view, expressions like ‘I’ll kill you’ or ‘fuck off and die’ are not credible threats, but only expressions of discontent and irritation.”

The document also states: “People often express contempt and disagreement through threats or calls for violence that are flippant or even humorous in nature.”

Facebook acknowledges that “not all unpleasant or disturbing content violates our community standards”.

Monika Bickert, Facebook’s head of global policy, said that the social network has almost 2 billion users, which makes it extremely difficult to reach a consensus on what may and may not be published.

“People around the world hold extremely different views, so they may have very different ideas about what should and should not be shared. Wherever we draw the lines, grey areas will always remain. For example, the boundaries between humour, satire and inappropriate content are often very blurred. It is sometimes difficult to determine whether certain types of content can be published on the site,” she explained.

“We understand that we are responsible for keeping our community safe, and we take that responsibility seriously. We will continue to invest in keeping our site safe, and we encourage all users to report any content that violates our standards.”

According to her, some offensive comments may violate Facebook’s policies in one context but not in another.

The documents obtained by Guardian reporters on moderating violent content, including scenes of violent death, non-sexual child abuse and animal cruelty, show how the site’s management is trying to navigate this minefield.

The document states: “Videos of violent deaths are disturbing, but they can help raise awareness. For videos, we believe that minors need protection and adults need a choice. We put special marks on videos showing the violent deaths of humans.”

These videos should be “hidden from minors”, but they should not be removed automatically, because they can “prove valuable in raising awareness of issues such as self-harm, mental illness, war crimes and so forth”.

Regarding non-sexual child abuse, the Facebook documents say: “We do not remove photos of child abuse. We mark videos of child abuse with a special ‘disturbing’ label. We remove photos and videos of child abuse if they include elements of sadism or glorify the violence.”

One of the slides explains that Facebook moderators do not automatically remove evidence of non-sexual child abuse, so that users can share these materials and help “identify and rescue the child, but we add special marks to protect the audience”.

Facebook’s management has confirmed that “there are times when we allow material showing non-sexual abuse of children in order to help those children”.

On the site’s policy on animal cruelty, one of the documents reads as follows: “We allow photos and videos of animal abuse in order to raise awareness, but we may add special marks to protect the audience if the content is extremely disturbing.”

“In general, users can share photos and videos showing cruelty to animals. Some particularly shocking material may carry a special mark.”

Photos of mutilated animals, including those showing torture, are more likely to be marked as “disturbing” than removed. Moderators can also allow images of violence against animals in which a person is beating an animal.

In the documents, Facebook says: “We allow users to share photos and videos of violence against animals in order to raise awareness and condemn the abuse, but we remove content that glorifies cruelty to animals.”

These documents indicate that Facebook’s management issued new guidelines on nudity after last year’s scandal, when moderators deleted an iconic photograph taken during the Vietnam war because the girl in it was naked.

Under the new rules, moderators can make “newsworthy exceptions” for material showing the “terrors of war”, but images of “child nudity in the context of the Holocaust” are still not allowed.

Facebook told the Guardian that its moderators use special software to intercept graphic content before it reaches the site, but that “we want people to have the opportunity to discuss current events in the world… so the context in which users share violent images is sometimes of great importance”.

Some critics in Europe and the United States are demanding that Facebook be subject to the same rules that apply to print media and publishers.

But Bickert says that Facebook “is a new kind of company. It is not a traditional technology company and not a traditional media company. We build technology and feel responsible for how it is used. We don’t write the news that people read on our platform.”

A report prepared by British parliamentarians, published on 1 May, states: “The largest and most wealthy social networks are shamefully far from taking adequate measures to prevent the spread of illegal and dangerous content, to implement appropriate standards and to keep their users safe.”

Sarah T Roberts, an expert on content moderation, said: “It’s one thing to be a small online community of people with similar views and principles, but when your social network’s users make up a significant share of the world’s population and you allow each of them to share their thoughts, you will inevitably run into problems.”

“And when you monetise that practice, you are heading for disaster.”

Facebook has long struggled to find criteria for assessing the news value, or awareness-raising value, of pictures and videos showing violence. Although the company recently faced a wave of criticism after its moderators failed to delete videos of the murder of Robert Godwin in the United States and of a father killing his child in Thailand, the platform has also played an important role in disseminating videos of killings committed by police officers and other abuses by the authorities.

In 2016, Facebook moderators removed a video showing the immediate aftermath of the police shooting of Philando Castile, but the video was later restored to the site, with Facebook saying the removal had been a mistake.
