‘I Would See People Get Shot in the Face:’ TikTok Ex-Moderators Sue Over On-the-Job Trauma

A small army of overworked content moderators is the public’s last line of defense against a flood of depraved and horrific content uploaded to social media. While the moderators help us normal users avoid the worst of the worst, constant exposure to humanity’s darkest impulses can wreak havoc on their mental health. Now, two former moderators at TikTok are alleging the quickly growing social media giant skimped on mental health treatment for moderators struggling to deal with an onslaught of digital nightmares.

“I would see people get shot in the face,” one said in an interview with NPR. “And another video of a kid getting beaten made me cry for two hours straight.”

The lawsuit, filed Thursday, claims TikTok and its parent company ByteDance violated California labor laws by not ensuring moderators were adequately protected from emotional trauma caused by exposure to the images. Moderators, the suit claims, act as “the gatekeepers between the unfiltered, disgusting, and offensive content uploaded to the app and the hundreds of millions of people who use the app every day.” The suit specifically accuses TikTok of negligence, negligent exercise of retained control, and violations of California’s Unfair Competition Law.

“Defendants [TikTok and ByteDance] are aware of the negative psychological effects that viewing graphic and objectionable content has on moderators,” the suit argues. “Despite this, defendants fail to implement acknowledged standards of care to protect moderators from harm.”

By now, it’s no secret that content moderators have to wade through some awful material, but the suit claims moderating TikTok is actually worse than moderating other platforms. While other firms have put in place harm-mitigation measures recommended by industry groups, such as using filtering technology to distort images and providing mandatory counseling for moderators, TikTok has not, according to the suit. “This unmitigated exposure and callousness towards implementing standards of care, resulted in Plaintiffs [the moderators] being exposed to thousands of graphic and objectionable videos, including graphic violence, sexual assault, and child pornography,” the suit alleges.

Former TikTok moderators Ashley Velez and Reece Young claim they didn’t receive proper mental health treatment, which in turn led to an unsafe work environment. The two moderators claim they were exposed to a laundry list of the most horrific shit on the internet: videos of child pornography, rape, bestiality, murder, beheadings, and suicide crossed their desks.

Young reported witnessing a 13-year-old girl being executed on camera by a cartel member. Velez told NPR that images and videos involving underage children accounted for the vast majority of the troubling content she was exposed to. “Somebody has to suffer and see this stuff so nobody else has to,” Velez told NPR.

The suit claims the demanding productivity quotas imposed on workers are “irreconcilable with applicable standards of care.”

According to the suit, moderators are told to spend no more than 25 seconds reviewing each video and to judge with more than 80% accuracy whether a given video violates TikTok’s rules. Within those 25 seconds, moderators have to think through 100 possible tags that could potentially be applied to label problematic content, the plaintiffs said. Moderators work 12-hour shifts with a one-hour lunch break and two 15-minute breaks, the suit said.

“By screening social media posts for objectionable content, content moderators are our frontline soldiers in a war against depravity: a war we all have a stake in winning,” Steven Williams, one of the attorneys representing the TikTok moderators, said in a statement. “The psychological trauma and cognitive and social disorders these workers face is serious. But they are being ignored, and the problems will only grow worse—for the company and for these individuals.”

TikTok did not respond to Gizmodo’s request for comment.

On top of that mountain of digital terror, the suit claims moderators are also regularly exposed to torrents of conspiracy theories and misinformation, particularly around Covid-19, which their lawyers argue also causes traumatic reactions.

It’s worth noting that Velez and Young were both contractors, working through the staffing companies Telus International and Atrium Staffing Services, respectively. Though the moderators technically worked for separate companies, the lawsuit seeks to hold TikTok and ByteDance responsible, arguing they are the ones that set quotas, monitor workers, and handle disciplinary actions. Though a Telus International spokesperson told NPR the company does provide mental health counseling for its contractors, Velez claims it’s wildly insufficient. She said she had just one 30-minute meeting with a counselor who appeared inundated with requests from other distressed moderators.

Through the lawsuit, the moderators’ lawyers are hoping to win financial compensation for Velez and Young and pressure TikTok to provide mental health screening and treatment to its thousands of current and former content moderators. Gizmodo reached out to the firm for comment but has not heard back.

The moderators claimed, as is the case with many other moderators at rival tech firms, that they were required to sign non-disclosure agreements preventing them from discussing the images they saw. After a day of toiling through humanity’s darkest recesses, the workers then have to bury those stories, unable to speak about them with friends or even family.

“They saw so many people that it didn’t seem like they had time to actually help you with what you were suffering with,” Velez told NPR.

While TikTok, like other major platforms, deploys artificial intelligence to catch the bulk of problematic content, the sheer flood of potentially harmful uploads means human moderators remain indispensable. These moderators are often independent contractors working for lower pay, with less job security and fewer benefits than workers employed directly by the tech companies.

Researchers out of the University of Texas and St. Mary’s University released a paper last year chronicling the academic literature on content moderators and found ample evidence that repeated exposure to harmful content can lead to PTSD and other psychological harms.

“While moderation work might be expected to be unpleasant, there is recognition today that repeated, prolonged exposure to specific content, coupled with limited workplace support, can significantly impair the psychological well-being of human moderators,” the researchers write.

In other cases, moderators at YouTube and Facebook have reportedly been hospitalized for acute anxiety and depression following repeated exposure to the content. And unfortunately for everyone, the internet isn’t getting any less fucked up. Just this week the National Center for Missing & Exploited Children said 29.3 million items of child sexual abuse material were removed from the internet last year. That’s a record and a 35% increase from the amount of material removed a year prior.

The mental health struggles plaguing content moderators throughout the tech industry have gained public attention in recent years thanks to an outpouring of revelatory reports and other legal action. Numerous media outlets have documented the often shocking working environments for moderators at Facebook and YouTube, though comparably little has been written about TikTok moderators.

Two years ago, Facebook settled a lawsuit brought against it by thousands of moderators for $52 million. The same law firm representing Velez and Young also represented the Facebook moderators. That settlement stemmed from a 2018 lawsuit filed by Facebook moderator Selena Scola, who claimed she had developed PTSD after viewing instances of rape, suicide, and murder on the job. The $52 million was distributed among thousands of contractors, with each receiving at least $1,000 in compensation. A former YouTube content moderator also sued her employer back in 2020, claiming she developed depression and symptoms associated with PTSD after viewing beheadings and child abuse imagery. It’s only fitting that TikTok, one of the fastest growing social media platforms, would also find itself on the receiving end of litigation.
