
TikTok hit by another lawsuit over working conditions for its content moderators

TikTok is being sued by former content moderators who claim the job traumatized them. (nikkimeel/Adobe Stock)

By Clare Duffy, CNN Business

TikTok has been hit with another lawsuit from former content moderators who claim the job traumatized them.

Ashley Velez and Reece Young, former contract content moderators for TikTok, allege that their work involved reviewing “unfiltered, disgusting and offensive content,” including “child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder,” according to a complaint filed Thursday in a California district court against the popular short-form video platform and its parent company, ByteDance. They accuse the company of negligence, alleging that it failed to provide adequate care to protect moderators from harm and support them after reviewing such content.

“By requiring content moderators to review high volumes of graphic and objectionable content, Defendants require content moderators to engage in abnormally dangerous activities,” the complaint alleges, adding that the company is “failing to implement acknowledged best practices to mitigate risks necessarily caused by such work.”

TikTok did not immediately respond to a request for comment.

This is the second recent lawsuit alleging that TikTok fails to adequately support its content moderators.

Contractor Candie Frazier — who was represented by the same firm as Velez and Young — filed a lawsuit in December against TikTok and ByteDance, alleging she had developed anxiety, depression and posttraumatic stress disorder as a result of her work reviewing disturbing and violent content on TikTok. At the time, a TikTok spokesperson said the company would not comment on ongoing litigation but that it offers “a range of wellness services so that moderators feel supported mentally and emotionally.”

“We strive to promote a caring working environment for our employees and contractors,” the TikTok spokesperson said in December. “Our Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community.”

Frazier dropped her suit last month and is considering her options, according to her lawyer.

Thursday’s lawsuit comes amid heightened scrutiny of content moderation practices at TikTok and other social media platforms, scrutiny that has only intensified as false claims and conspiracy theories spread about the war in Ukraine. Earlier this month, a nationwide group of state attorneys general launched an investigation into TikTok’s user engagement practices and the platform’s alleged potential harms to young people. TikTok said in a statement about the investigation that it limits its features by age, provides tools and resources to parents, and designs its policies with the well-being of young people in mind.

TikTok had previously flown under the radar compared with larger rivals such as Facebook and YouTube, but it has drawn growing attention from critics and lawmakers in recent months after exploding in popularity, especially among young people, during the pandemic. The company said in September that it had reached 1 billion monthly active users. TikTok said last month it would strengthen efforts to regulate dangerous content, including harmful hoaxes and content that promotes eating disorders and hateful ideologies.

Velez and Young were not TikTok employees; instead they worked remotely for staffing firms that supply contractors to work as content moderators for the platform. Young worked as a TikTok moderator for New York-based Atrium Staffing Services for about 11 months starting in 2021, according to the complaint. Velez spent about seven months working as a TikTok moderator for Canada-based Telus International, the same firm that employed Frazier. Atrium and Telus did not immediately respond to requests for comment.

Although they worked for two different companies, the complaint states that Velez and Young “performed the same tasks, in the same way, using applications provided by” TikTok and ByteDance, and that the social media giant set quotas, monitored and disciplined the moderators.

The lawsuit — which seeks approval as a class action — alleges that the moderators were exposed to disturbing content, including “a thirteen-year-old child being executed by cartel members” and “bestiality and necrophilia.” They also faced “repeated exposure” to fringe beliefs and conspiracy theories such as claims that the Covid-19 pandemic is a fraud, Holocaust denial and manipulated videos of elected officials, according to the complaint.

The complaint claims that because of the large volume of videos moderators must review, they often had fewer than 25 seconds per video and would view multiple videos simultaneously. Moderators are offered two 15-minute breaks and an hour-long lunch during each 12-hour workday, but ByteDance withholds pay if moderators are off the moderation platform at any other time during the day, the complaint alleges.

The lawsuit also accuses the company of failing to implement safeguards for moderators, such as blurring or changing the color of some disturbing videos, and of reducing the “wellness” time offered to moderators from one hour to 30 minutes each week.

“As a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace, [Young and Velez] have suffered immense stress and psychological harm,” the complaint states. “Plaintiffs have sought counseling on their own time and effort due to the content they were exposed to.”

Theo Bertram, TikTok’s then-director of public policy for Europe, the Middle East and Africa, told British lawmakers in September 2020 that the company had 10,000 people worldwide on its “trust and safety” team, which oversees content moderation policies and decisions. TikTok last year also launched an automated moderation system to scan and remove videos that violate its policies “upon upload,” although the feature is only available for certain content categories.

The system handles “content categories where our technology has the highest degree of accuracy, starting with violations of our policies on minor safety, adult nudity and sexual activities, violent and graphic content, and illegal activities and regulated goods,” a July blog post from TikTok’s Head of US Safety, Eric Han, reads. “We hope this update also supports resiliency within our Safety team by reducing the volume of distressing videos moderators view and enabling them to spend more time in highly contextual and nuanced areas.”

Still, Thursday’s complaint states that more than 81 million videos were removed from TikTok in the second quarter of 2021 — a figure TikTok reported in February — and alleges that most were removed by human content moderators rather than automated tools.

The suit also alleges that the moderators were required to sign non-disclosure agreements as part of their jobs, forcing “them to keep inside the horrific things they see while reviewing content.” The claims in Thursday’s lawsuit are consistent with allegations made in Frazier’s earlier lawsuit.

Thursday’s lawsuit seeks to have TikTok and ByteDance fund a medical monitoring program to help diagnose and treat moderators’ mental health conditions, as well as other, as-yet-unspecified financial damages.

The-CNN-Wire™ & © 2022 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.




