
A visit to Facebook: these people keep your timeline clean



Every day, two million messages, photos and videos on Facebook and Instagram are reviewed by so-called content moderators. They look at child pornography, bullying and violent videos so that we don't have to see them on our timelines. RTL Z visited the Dutch people who have made this their job.

"You are used to violence," said a content moderator of the young woman we spoke on Facebook in Barcelona.

She and her colleagues are served a daily stream of violent posts. They have to assess messages, photos and videos against Facebook's house rules.

Often these are posts that have been reported by Facebook and Instagram users. But they can also be posts flagged by artificial intelligence.

Think of reports of bullying, nude videos, terror videos or other shocking images.

The silliest things

What the moderators will see is a surprise every time. "People report the silliest things, for example a post from a friend that they simply don't like," says another moderator in a conversation with RTL Z and a number of other Dutch-language reporters, who were allowed to look behind the scenes at the CCC for the first time.

The CCC is one of the companies Facebook hires to assess all reported posts. According to the social network, there is at least one such center in every time zone, so that it can intervene 24 hours a day, all over the world.

& # 39; You are familiar with it & # 39;

Reported photos, messages and videos can be quite innocent. But the moderators also see pornographic videos and images of people with suicidal tendencies.

"It won't be easy to see terrible things being put in. But if you see it more often, you will get used to it a little," said the boy who led the Dutch team.

No thumbs-up in sight

When RTL Z arrives at the CCC building, nothing indicates that work for Facebook is being done here. In one of the tallest buildings in Barcelona, the company rents eight floors where several hundred people keep Facebook, Instagram and Facebook Messenger clean.

Nowhere on the facade of the building do you see Facebook's name or thumbs-up logo, and the lobby gives nothing away either. The entrance looks drab, with a few colorless sofas, a fake bookcase and some sad plants to cheer things up.

Figure © RTL Z

"We are not talking about these places & # 39;

Before we enter, we are clearly instructed that we are not allowed to film, except in locations designated by Facebook.

"We are not talking about these places," explained Dave Geraghty of Facebook, Director of Global Market Operations Facebook. This is among other things to ensure employee safety, he said.

Geraghty cites the shooting at YouTube headquarters in April as an example. "That was an attack on YouTube employees because someone did not agree with a decision to take videos offline. We take our safety very seriously, and that is why we don't disclose these locations."

CCC employees queue for the elevators
Figure © RTL Z

The numbers

Only since this year has Facebook published figures on how often the company has to intervene in posts on its social networks. In total, the content moderators – there are 15,000 of them – review more than 2 million posts per day.

In the space of a year, Facebook had to intervene in more than 3.9 billion posts. Spam is by far the largest category (3.751 billion).

The most recent figures, for July, August and September, show that Facebook intervened in almost 1.3 billion posts. The largest share consists of deleted spam messages (1.271 billion). In addition, there were 30.8 million posts with nudity or sexual content, 2.1 million cases of bullying and 15.4 million posts containing violence. 1.5 billion fake accounts were also taken offline.

Figure © RTL Z

Still work for humans

Facebook is also increasingly using artificial intelligence. It can recognize images, videos and posts in advance, before Facebook users even see them. For example, it spotted 99 percent of all terrorist posts before anyone reported them.

In addition, artificial intelligence automatically removes many posts itself. It learns to recognize spam, for example, and can delete it quickly. Previously removed photos are also easily intercepted before they come online and people have to look at them.

"In the end, we want artificial intelligence to do all the repetitive work," said Siobhan Cummiskey, head of public policy on Facebook. However, intervention in all these messages remains an important part of people's work.

"We see that content must be seen by real people for more nuanced things, such as hatred and intimidation."

But that doesn't always go well. One in ten times, a moderator's decision to delete a post or leave it up turns out to be wrong, CEO Mark Zuckerberg said two weeks ago. "There is always more we can do; we have invested heavily in technology," says Cummiskey.

A moderator at work
Figure © RTL Z

Burnout, alcohol and drugs

By inviting the media to look behind the scenes, Facebook wants to clear up misunderstandings. That is no unnecessary luxury, because over the past year many reports from former employees of centers like this one have emerged.

And the picture they paint is not a rosy one. People become overwhelmed by the violent images they see. According to these reports, alcohol and drug use are the order of the day, just to cope with the stress the work generates.

Two months ago, Facebook was sued by a former employee who struggles with PTSD because of the violent images she saw. The complaint: the company did not provide enough support to cope with this.

The execution video

Earlier this year, former employee Sjarrel spoke out; he prefers not to give his surname. He worked in a similar center in Berlin and saw very graphic images during his work. A number of those images have stayed with him, such as videos of violent rape and videos of IS executions.

"It was a picture of a man in an orange overalls handcuffed, trying to escape and curl up on the asphalt road."

The man is run over by a tank. "Afterwards, the camera zooms in on his body."

After seeing the video, Sjarrel had to step outside to calm down. But when he returned, no manager checked on him. "That was the signal for me: I have to get out of here, because you shouldn't expect any support here."

Psychological support

The head of public policy finds it difficult to hear that these kinds of stories are coming out. "Because we spend a lot of time and energy on caring for our content reviewers," says Cummiskey.

She stresses that employees can receive psychological support 24 hours a day; in Barcelona, five people are available for this. In addition, employees who are having a hard time can always take a break, for example to play PlayStation or table football in large rooms specially equipped for this, according to Cummiskey.

During our tour of the building, we also see those rooms. When we start filming there, two people are playing FIFA. We are allowed to film, but once again, they must not be recognizable in the shot.

Two employees relax with a game of FIFA
Figure © RTL Z

A different picture

The five content reviewers we speak to do not recognize themselves in the picture Sjarrel sketches. The question is how freely they can speak: the CCC boss sits in on the conversation with the content moderators, and questions and answers have to be given in English.

The response from one of the employees is clear: "There is no drug problem here." Another adds: "I don't recognize the picture they are talking about at all."

During the group discussion with the content reviewers, the boss interjects twice. Once to say that the number of people leaving is very low.

"People go for better paid jobs, but it's not because of the nature of the work, they always feel it's meaningful, but feel free to know if it's different," he said.

One of the men, who has been doing this work for a year and a half, adds that he feels valued. "We see a lot of things that hurt. But for our market it's not too bad, compared to others. We feel valued."

Of the 70 Dutch people working at the CCC in Barcelona, only one has left to do something else.
