Facebook post-scanning teams in Germany to get extra psychological care
A German Facebook contractor whose employees scour posts for graphic images received a surprise visit from Berlin's health and safety authorities, it has emerged.
Arvato, a subsidiary of media giant Bertelsmann, has provided content moderation services for Facebook since late 2015. At that point, the team included about 100 people. But now Arvato employs some 600 workers in Berlin, whose task is to respond to reports made by Facebook's users of illegal content on the social network.
The Arvato employees have to sift through numerous posts to judge whether they violate Facebook's community standards and need to be removed. That can mean viewing material such as images of child sexual abuse and animal cruelty, and deciding whether posts qualify as hate speech, which is illegal under German law.
Late last year, Germany's Süddeutsche Zeitung published an exposé of the working conditions at Arvato's Facebook unit. Based on interviews with former and current employees, the newspaper alleged that the workers receive insufficient support for the mental-health impact of their jobs, and that they are paid barely above the minimum wage. Mobile Geeks has also reported on the unit's activities.
On Friday, Der Spiegel reported that the Berlin authority for worker health and safety, LAGetSi, had sent two inspectors to Arvato's office at the end of February, to investigate the unit's mental-health support arrangements and its employees' working hours. The inspectors reportedly took documents away with them.
However, the agency said it has so far found no grounds for formal action against Arvato. Der Spiegel reported that Arvato has improved its mental-health support arrangements in response to last year's anonymous allegations.
"Arvato takes the concerns and well-being of its employees at its subsidiaries seriously," the company told ZDNet in a statement.
"This includes providing comprehensive healthcare as well as supplementary care by company doctors, psychologists, and in-house social services. We have implemented high standards and a variety of measures, which we continually develop in an open dialogue with the employees and their representatives."
Facebook itself did not provide any comment on the matter.
The social network is currently under renewed attack in Germany over the adequacy of its content moderation, particularly when it comes to posts containing xenophobic and racist statements.
Justice minister Heiko Maas recently drafted a law that would hit social networks with fines of up to €50m ($54m) if they don't review reports of hate speech and remove the offending posts within 24 hours.
Those ground rules match what companies such as Facebook and Twitter already agreed to follow in late 2015, after criticism from Maas. However, the justice ministry is dissatisfied with the US firms' self-regulatory efforts, claiming that only a third of hate-speech posts are deleted within the agreed time.
If Maas's law passes, and if Facebook can't figure out the technology to deal with the problem through automated moderation, then the company's German post-scanning team will probably need to be beefed up even more.