Sunday, June 23, 2024

TikTok and Meta’s Moderators Form a United Front in Germany

Screening social media content to remove abuse or other banned material is one of the toughest jobs in tech, but also one of the most undervalued. Content moderators for TikTok and Meta in Germany have banded together to demand more recognition for workers who are employed to keep some of the worst content off social platforms, in a rare moment of coordinated pushback by tech workers across companies.

The combined group met in Berlin last week to demand higher pay from the two platforms, more psychological support, and the ability to unionize and organize. The workers say that low pay and a lack of prestige mean moderators are unfairly classified as low-skilled workers under German employment rules. One moderator who spoke to WIRED says this classification forced them to endure more than a year of immigration red tape in order to stay in the country.

“We want to see recognition of moderation not as an easy job, but an extremely difficult, highly skilled job that actually requires a large amount of cultural and language expertise,” says Franziska Kuhles, who has worked as a content moderator for TikTok for four years. She is one of 11 elected members chosen to represent workers at the company’s Berlin office as part of an employee-elected works council. “It should be recognized as a real career, where people are given the respect that comes with that.”

Last week’s meeting marked the first time that moderators from different companies have formally met with each other in Germany to exchange experiences and collaborate on unified demands for workplace changes.

TikTok, Meta, and other platforms rely on moderators like Kuhles to ensure that violent, sexual, and illegal content is removed. Although algorithms can help filter some content, more sensitive and nuanced tasks fall to human moderators. Much of this work is outsourced to third-party companies around the world, and moderators have often complained of low wages and poor working conditions.

Germany, which is a hub for moderating content across Europe and the Middle East, has relatively progressive labor laws that allow the creation of elected works councils, or Betriebsräte, inside companies: legally recognized structures similar to, but distinct from, trade unions. Works councils must be consulted by employers on major company decisions and can have their members elected to company boards. TikTok workers in Germany formed a works council in 2022.

Hikmat El-Hammouri, regional organizer at Ver.di, a Berlin-based union that helped facilitate the meeting, calls the summit “the culmination of work by union organizers in the workplaces of social media companies to help these key online safety workers—content moderators—fight for the justice they deserve.” He hopes that TikTok and Meta workers teaming up can help bring new accountability to technology companies with workers in Germany.

TikTok, Meta, and Meta’s local moderation contractor did not respond to a request for comment.

Moderators from Kenya to India to the United States have often complained that their work is grueling, with demanding quotas and little time to make decisions on the content; many have reported suffering from post-traumatic stress disorder (PTSD) and psychological damage. In recognition of that, many companies offer some form of psychological counseling to moderation staff, but some workers say it is inadequate.


Though Kuhles, the TikTok moderator, declined to discuss her own pay, she said that many moderators make little more than the German minimum wage of €12 ($12.80) an hour.

The low pay, along with the categorization of moderators as “unskilled” labor under German employment rules, creates serious challenges for non-EU nationals working in Germany, as they are highly dependent on their employers for visas. Because moderation teams generally must handle material from a large swathe of the world, they often include immigrant workers.

One content moderator who works for an outsourcing company on contract with Meta in Germany tells WIRED, speaking on condition of anonymity, that during their three years working on violent content, they experienced PTSD symptoms, including vivid nightmares. Worried that alerting the company to their symptoms might lead bosses to terminate their short-term contract, they said nothing and tried to manage alone.

“I needed the job to support my visa process,” they say. “I couldn’t quit.” The process of obtaining permission to stay in Germany was made lengthier by the work being low-paid and not designated as skilled labor. “We are hoping the works council will step in and get these jobs designated as skilled labor, which will automatically increase the salary,” they say.

Martha Dark, director of Foxglove Legal, a UK-based nonprofit that has supported workers in the moderation industry, says that this kind of recognition is important not just for the people working to keep social media platforms free of harmful content, but for the platforms and their users.

“Without the hidden army of content moderators, the online world’s key safety workers, there is no Facebook, no TikTok, no YouTube, and no Google. No one knows better than them the steps that must be taken to keep us safe online,” Dark says. “Immediate steps must now be taken by the companies to keep workers safe. There is no excuse, and tech giants must make this right without delay.”
