For months, Tassos was barred from speaking about his job in the southern Athens suburb of Moschato. His task was to determine whether content posted on Facebook should be removed. In the few seconds before the next post popped up on his screen, he would feel his chest tighten. If he had the slightest doubt about his judgment, he had to consult an online rulebook cataloguing the crudest violations of community standards. There, he would encounter imagery of hard-core pornography and animal torture. “I can still recall a picture depicting two little dogs that were hung from their front feet,” Tassos told Kathimerini. But the line between inappropriate content and censorship was not always clear to him. “The guidelines set by Facebook were nebulous and the phrasing was sloppy. We often did not know whether we were making the right call,” he said.
Tassos was one of the hundreds of Teleperformance employees who work in Athens as content moderators for Facebook. They determine what is inappropriate or misleading and must be deleted from the platform. Unlike the so-called “cleaners” in other parts of the world, the Athens-based staff do not moderate controversial content that users share on Facebook as personal posts, but only ads. During training, Tassos was told that, besides Greece, Facebook ran two more Business Integrity teams, in Portugal and Malaysia.
About 1.59 billion users are active on Facebook daily, making it one of the most popular global tools for advertising, news-sharing and self-promotion. The platform is now in a position to set its own terms and conditions on public discourse. However, it is not always clear how its content deletion policy is implemented, or by whom. Following a months-long investigation, Kathimerini has managed to breach the wall of silence surrounding the global tech giant’s content moderation operation right here in Athens, in a multistory building off Pireos Street.
Facebook is tight-lipped about the Athens-based team. “We have thousands of people, who have broad language expertise (including Greek), working across the globe, 24/7,” Jan Sciegienny, Facebook’s corporate communications manager for Central and Eastern Europe, wrote in an emailed response to Kathimerini on June 28, 2017, as he declined the newspaper’s request for an interview. Three similar requests were turned down on November 3, 2017, March 27, 2018 and June 26, 2019, the last one involving a visit to one of the offices of Teleperformance in Greece.
The Greek connection
Facebook has been working in Athens with Teleperformance, a France-based multinational specializing in outsourced omnichannel customer experience management, since at least September 2018.
“We work in partnership with Teleperformance in Athens to facilitate our policies on advertising and to support other business activities. We work closely with them, as we do with all our partners, to make sure that their teams are well taken care of and that they have access to all the support they might need. We ask these teams to sign a non-disclosure agreement so as to protect the data under moderation,” a Facebook representative told Kathimerini (the message was sent in Greek by the public relations firm representing the company in Greece). Teleperformance did not respond to questions about this report.
Facebook’s Business Integrity team in Greece employs an estimated 800 staff of various nationalities, including Norwegian, Finnish, Turkish and Israeli, who cover needs in dozens of countries. Greek moderators often work on content published in other languages, including Russian, French and German. Tassos used to be part of that team. Kathimerini first approached him in February; he eventually agreed to open up about his experience over two meetings, as there was so much that Tassos (not his real name) wanted to share that one was not enough.
A Teleperformance LinkedIn ad looking for Norwegian-speaking moderators.
His employment contract, which has been seen by Kathimerini, confirms that he worked for Facebook. According to the job description, he provided call support and “services primarily in the Facebook customer service department.” Tassos explained that his work did not in fact involve telephone calls. He was prohibited from discussing his position with anyone and was not allowed to reveal the identity of his employer on social media (Facebook, Twitter, LinkedIn, Instagram).
A second person, also a Greek national, who still works for the company, had originally agreed to speak to Kathimerini for this report but changed his mind about an hour before the meeting, saying he was afraid that his testimony might negatively affect his colleagues.
Tassos found himself at the offices of Teleperformance after responding to a job posting. During a brief interview, he says, he was informed that the position could involve exposure to violent imagery. He went through three weeks of training before starting work at Moschato.
Driving along Pireos Street, it’s easy to miss the company premises. A cluster of abandoned buildings at the front and high walls block most of the view inside. The nearby streets are full of car repair shops, delivery services and machine shops – nothing to suggest that a Facebook department might be based in the neighborhood. Only a careful observer will spot, from street level, the modern building complex that hosts hundreds of workers.
As soon as he arrived at the office every morning, Tassos would leave all his personal items, including his cell phone, in a locker on the ground floor. “On our desk, we were only allowed to have water or coffee in a transparent cup. We were not even allowed to have tea bags, or any other piece of paper on which we might write a note,” he said. To get to his office, he had to use a swipe card to enter the elevator and again to access the floor where his department was located. Once there, he would sit down at one of the large desks, each of which accommodated eight employees.
Tassos was part of the group of lower-ranking employees, known as agents or moderators, who shouldered most of the workload. “Informally, we had to screen 100 posts per hour. But there wasn’t any pressure or pay cuts if we didn’t,” he said. The queue of posts awaiting moderation also included certain already resolved cases, called “golden standards,” which were used to assess the moderators’ judgment. He worked eight-hour shifts, with a single, strictly timed 45-minute break so as not to disrupt production. He said that as a shift drew to a close, some of his co-workers would simply approve or reject content without further deliberation, their aim being to hit the informal target.
But what is classified as inappropriate commercial content? Tassos said the staff would take down ads about gambling, arms trading, pharmaceuticals and drugs, as well as deceptive weight-loss ads. “For some time, [ads of] designer knockoffs were high on the agenda. Later on, removing pornographic content became our main priority,” he said. Part of his job was inspecting the links in an ad, as they often directed visitors to porn sites.
Apart from the more obvious violations, an image of a female breast could also be classified as pornography if it depicted the areola or was portrayed too explicitly. The company manual contained specific examples that looked straight out of a police vice squad catalogue. According to Tassos, some examples even specified the exact centimeters of exposed flesh.
“We would take down works of art if they depicted the areola,” he said, recalling the case of a foreign beauty treatment center that had posted an ad featuring a Renaissance painting depicting a nude woman. In 2018, Facebook infamously censored the prehistoric “Venus of Willendorf” figurine. A Facebook representative later apologized for the mistake.
The “Venus of Willendorf” figurine.
The job also means that moderators are often exposed to depictions of violence. “The worst thing that can happen to you is having to deal with a low-resolution video, where you have to watch the footage again and again to determine whether or not it shows a rape,” Tassos said. He explained that three psychologists were available on site and that employees could book one 30-minute session every two weeks. Although he had used the service, Tassos did not feel comfortable with it, concerned that his conversations might leak and result in his dismissal.
Foreign media have investigated the long-term impact of cruel imagery on the mental health of content moderators. “There was literally nothing enjoyable about the job. You’d go into work at 9 a.m. every day, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off,” an anonymous cleaner told the Guardian in 2017.
Three former content moderators in the United States – Selena Scola, Erin Elder and Gabriel Ramos – who worked for Facebook via third-party vendors and contractors, filed a lawsuit against the company in March with a California superior court, claiming they suffered psychological trauma and post-traumatic stress disorder as a result of being exposed to violent images. One of them claims to have suffered from depression and recurring nightmares.
According to a 25-page legal document seen by Kathimerini, the plaintiffs claim that they did not receive proper training or psychological support while working for the company. They cite studies showing that people who are systematically exposed to images of paedophilia, such as cybercrime police officers, may experience similar symptoms of PTSD. Furthermore, they say that the non-disclosure agreements signed by employees compound the psychological trauma by preventing them from sharing their experience with others. Facebook claims that the agreements are aimed at protecting moderators from disgruntled users and preventing leaks of personal data.
Lawyer Konstantinos Kakavoulis, a member of Homo Digitalis, a nongovernmental organization for the protection of internet users in Greece, raises two key concerns regarding the secrecy surrounding Facebook’s operations in the country. “A first issue concerns the selection process and the cognitive level of the individuals who perform this type of work,” he says. “A second concern is that wrong decisions made due to pressure could subsequently form the basis for training an AI system that will perform the same task for Facebook in the coming years.”
The contract Tassos signed with Teleperformance to work as a Facebook moderator in Greece.
Tassos says he took on the job to make a living. Under his contract, he received a monthly salary of around 1,000 euros. He says that pay levels depended on nationality and language, with those in the English-speaking and Greek-speaking departments appearing to make less. “I never made more in any other job. I felt like a king,” he said. With time, however, Tassos felt that the nature of the job was wearing him down.
“If you wanted to be productive, you had to keep your eyes on the screen,” he said. “It was a lot of pressure.”