A TikTok moderator has filed a lawsuit alleging “psychological harm”
A former TikTok moderator is suing the company, alleging that it failed to protect her mental health after she was continuously exposed to disturbing video content.
Candie Frazier claims she spent up to 12 hours a day watching videos featuring "severe and graphic violence".
She says she now suffers from severe psychological trauma, including anxiety, depression, and post-traumatic stress disorder.
TikTok says it strives to promote a caring working environment.
TikTok reported in September that 1 billion people use the app each month, and according to Cloudflare, an internet infrastructure and security firm, the site now receives more traffic than Google.
Thousands of in-house and contract content moderators screen out videos and accounts that violate the video-sharing platform’s guidelines to protect its users.
Ms Frazier has filed a lawsuit against TikTok and its parent company, ByteDance, the Chinese tech giant.
She alleges that as a moderator she witnessed disturbing content, including videos of sexual assault, cannibalism, genocide, mass shootings, child sexual abuse, and animal mutilation.
Ms Frazier, who worked for Telus International, a third-party contractor, says she was required to review hundreds of videos per day.
According to the lawsuit, filed in federal court in California last week, Ms Frazier suffered severe psychological distress, including anxiety, depression, and post-traumatic stress disorder, as a result of the material she was required to review.
While she was not a TikTok employee, the lawsuit argues that the social media giant controlled the means and manner in which content moderation took place.
Ms Frazier claims that to keep up with the volume of footage she was expected to review, she had to watch up to ten videos at once.
According to the lawsuit, during a 12-hour shift, moderators were given a 15-minute break after the first four hours of work, and then another 15-minute break every two hours after that. A one-hour lunch break was also included.
The lawsuit claims TikTok failed to follow industry standards aimed at mitigating the effects of content moderation, and that the company broke California labour law by failing to provide a safe working environment.