TikTok content moderator sues, says she suffers from PTSD because of her job

  • A TikTok content moderator is suing the company, alleging it did not provide a safe workplace.
  • Frazier’s attorney said TikTok won’t let her work and it’s not clear whether she will be paid.
  • Facebook paid $52 million in a class-action settlement to moderators diagnosed with PTSD.

TikTok is preventing a content moderator from working after she sued the company, alleging that it failed to provide a safe workplace and that her job caused her post-traumatic stress disorder (PTSD), her lawyer said.

Candie Frazier’s class-action lawsuit, first reported by The Verge, accuses TikTok and its parent company, ByteDance, of injuring content moderators by causing them to engage in abnormally dangerous activity, according to a copy of the complaint obtained by Insider.

Frazier’s attorney, Steve Williams, told Insider that TikTok disciplined her “in retaliation” by barring her from working the day after she filed the complaint. Williams said it was “not clear” whether Frazier would be paid for the missed work.

The lawsuit, filed Thursday in the United States District Court for the Central District of California, said Frazier’s job involved spending 12 hours a day reviewing “disturbing” content. In the course of her work, Frazier witnessed “thousands of acts of extreme and graphic violence,” including child rape, animal mutilation, sexual assault, and mass shootings, according to the suit.

Due to “constant, unmitigated exposure to highly toxic content,” Frazier developed PTSD, anxiety, and depression, the suit alleges.

The suit also states that Frazier has “horrible nightmares” and often plays back “videos she’s seen in her head” while trying to sleep.

Despite being aware of how damaging content moderation work can be, ByteDance and TikTok have failed to implement industry safety standards, such as disabling audio and reducing or blurring parts of disturbing content, the lawsuit alleges. The suit also alleges that the companies do not provide adequate mental health support.

Content moderators watch three to 10 videos at the same time and spend no more than 25 seconds on a single video due to the “sheer volume” of TikTok content, according to the suit.

ByteDance and TikTok monitor content moderators through a proprietary video review program to verify that they “strictly adhere to breaks,” the lawsuit alleges, and the companies can “decline payment” if an employee takes more than the allotted break time.

A spokesperson for TikTok declined to comment on the litigation, but told Insider that the platform strives “to promote a caring work environment for our employees and contractors” and continues “to expand a range of wellness services” for the mental and emotional support of moderators.

When asked about Williams’ claim that Frazier was disciplined and unable to work as a result of the lawsuit, TikTok was not immediately available for comment.

Frazier is employed by Telus International, which hires content moderators for TikTok, and has been a “top-level content moderator” since January 2018, the lawsuit says. Telus International, which is not named as a defendant in the lawsuit, did not respond to a request for comment.

Frazier is seeking a jury trial and compensation for herself and other US content moderators who have been exposed to disturbing videos and images on TikTok, as well as an order requiring TikTok and ByteDance to provide mental health support and treatment to content moderators.

Content moderators’ claims that they developed psychological issues as a result of their work have also been a major issue on other social media platforms. Facebook paid $52 million in a 2020 settlement to moderators diagnosed with PTSD as a result of their work, which The Verge called “historic recognition of the toll content moderation takes on its workforce.”
