This is how the censorship centers work on Facebook

"A Clockwork Orange" is a Stanley Kubrick film famous for having been banned in many countries because of its violent imagery and an ambiguous message that many could misinterpret. In it, an ultraviolent young man is captured by the authorities and subjected to an experiment designed to inhibit his free will, that is, a method to make him a good person. How do they do it? With aversion therapy: the subject is strapped down in front of a giant screen, his eyes forced open, and made to watch unpleasant scenes for hours on end. When he is released and feels the urge to commit violent acts, nausea sets in, conditioned by the ordeal he endured days before.

Now, imagine watching violent, unpleasant images, hateful content, insults, videos of beheadings... for eight hours a day, on behalf of a large company. Your work at this center determines what others can and cannot see. You work at one of the 20 centers Facebook has set up around the world to administer censorship of content prohibited by Mark Zuckerberg's social network. The newspaper has had access to sources at these centers in cities such as Warsaw and Lisbon, as well as to documents that reveal the conditions under which these agents must apply censorship.

According to these documents, Facebook's agents of morality and good conduct work in appalling conditions: windowless offices in which hundreds of people must keep their gaze fixed on a screen where grotesque content streams past, in order to remove it before it spreads. Like Alex in 'A Clockwork Orange', they are subjects exposed to terrible images for eight hours a day, without it being fully understood what the consequences may be. The algorithms we talk about so much are nothing but people, like you and me.

Their physical and mental working conditions are extreme

A total of 15,000 people work in centers like these. Many of them claim it is impossible to be 100% fair: given the conditions in which they work and the nature of the task, mistakes happen continually. Facebook's censorship workers are paid 700 euros a month each and are subcontracted through international consultancies. They are forbidden to tell anyone that they work for Facebook, and must always refer to it as 'the client'. The newspaper has obtained statements from three employees of these centers, whose anonymity must be preserved.

The working conditions are extreme. They get half an hour of rest in their eight-hour day, which they must ration out hourly for visual breaks, stretching their legs, going to the bathroom, and even eating. A supervisor is always present in the room, noting and punishing inappropriate behavior: an employee who pauses while screening images to drink water, or who takes out a mobile phone to check something, faces sanction. In addition, a points program encourages employees to report one another if they see a colleague engaging in punishable behavior.

What does the job consist of?

The job works as follows. The employee sits in front of a screen that displays the content that has accumulated the most user complaints. The worker then has two options: 'Delete', which removes the content, or 'Ignore', if they consider that it does not violate Facebook's rules. Each person can analyze up to 600 cases a day, with a total of thirty seconds per case to press one of the two decision buttons. It is hardly surprising, given everything now known about these centers, that injustices are committed.

The newspaper gives examples of real cases in which the interviewed employees have had to intervene. One such case involved the image below, an illustration warning about breast cancer.


One automatic algorithm already in operation flags images in which female nipples are visible. There are exceptions, however, such as when the image is informative. It seems clear that, looking at the illustration above, the employee should 'unblock' the image, since it is information of general interest about cancer and not an image of an erotic nature.

[Image: meme involving child bullying]

This other example shows just how complicated the moderators' work can be. Is it a meme that depicts an act of child bullying? The employee must decide, within thirty seconds, whether or not to remove it. The image is a snapshot from a real interview, and obesity is not considered a disability. On Facebook, content that could offend someone is deleted only when the target belongs to a protected group, such as people with disabilities. In this case, the moderator would choose the 'Ignore' button.

The contradictions of censorship on Facebook

These two examples may be relatively clear, but they sit among many others where the ambiguity of Facebook's rules comes into play. For example, content referring to Hitler is automatically deleted, yet content glorifying Franco is allowed. One of the reviewers consulted claims that fascism is permitted on Facebook: you can fill your wall with photos of Mussolini and nothing will happen. But watch out for Hitler. Or female nipples.

Among the most absurd decisions Facebook's censors have to make is judging the size of Arab men's beards to determine whether or not they are terrorists. The interviewees also denounce a clear double standard that favors specific groups: "Insulting certain beliefs is allowed, while insulting others is completely prohibited." Zionism, for example. Any call to boycott Israeli products over the massacres the country carries out in Palestine is censored. Moreover, the probability that content will be removed from Facebook has a great deal to do with the power and organizational capacity of the group interested in making it disappear.

It remains to be seen what psychological consequences these Facebook employees will suffer, and what peculiar ideological use the company will make of content moderation.