The rules and guidelines, until now secret, that Facebook uses to decide what its 2 billion users see and what they can publish have just been made public for the first time thanks to an investigation by The Guardian. Far from burying the controversy, the more than one hundred internal manuals that have just come to light will fuel the debate on the ethics of the social media giant.
How does Facebook decide what to censor? Facebook has a large team of moderators who, aided by a series of algorithms, are responsible for applying the company's guidelines and deciding what does and does not appear on our walls.
Specifically, the style guide includes a number of ethical decisions, many of them already under discussion, to determine whether texts, images and videos on sensitive topics such as sex, violence or terrorism should appear on the most popular social network in the world.
According to the authors of the investigation published in The Guardian, Facebook has grown so much and so fast that it can no longer maintain control over its content.
In addition, as Xataka has reported, some of the company's own moderators believe that the rules Facebook imposes are inconsistent: for example, while a phrase encouraging someone to shoot Trump must be removed, a threat like "fuck and die" can be shared.
Sex, violence and terrorism on the table
Specifically, the most sensitive scenarios covered in the Facebook guide are those related to sex, violence and terrorism, although it is sometimes difficult to draw the line. It is not easy to decide which videos of violent deaths should be allowed and which should not, because according to the social network's experts, some of this content helps raise awareness of issues such as mental illness.
Hard as it may be to believe, photographs of physical abuse of minors are not censored either, as long as there is no sadistic or celebratory component, and images of animal abuse can be shared unless they are extremely violent. There is more: videos of an abortion are accepted as long as no nudity appears, and Facebook allows live broadcasts of people self-harming so as "not to censor or punish people in danger".
Another of Facebook's most controversial decisions is to allow, for example, expressions in favor of the death penalty, including statements in which the author expresses enjoyment of an execution carried out in the United States.
The size of the social network does matter
Facebook is a social network with more than 2 billion users, which makes content moderation a huge challenge. Moreover, Monika Bickert, Facebook's head of global policy management, pointed out in The Guardian that it is very difficult to reach a consensus on what to allow and what not.
Why is it so difficult? Because the Facebook community is very diverse, so different users have very different ideas about what is acceptable and what is not. In Bickert's opinion, no matter where you draw the line, there will always be gray areas and controversial topics. For example, the line separating humor from inappropriate content is sometimes so fine that it is hard to decide whether something is suitable for Facebook or not.
So that you can try your hand at moderation too, Facebook has prepared a small visual quiz in which you indicate which images from your news feed you would delete and which you would keep. Is it really so complicated to apply criteria that respect privacy and safety on the internet?
Photography: AllTheFreeStock.com