Former YouTube content moderator describes horrors of the job in new lawsuit


The plaintiff said she experienced nightmares, panic attacks and an inability to be in crowded areas as a result of the violent content she viewed.
YouTube signage. Rick T. Fallon / Bloomberg via Getty Images file

A former YouTube moderator sued YouTube on Monday, accusing it of failing to protect workers who have to catch and remove violent videos posted to the site. 

The suit says the plaintiff was required to watch murders, abortions, child rape, animal mutilation and suicides. YouTube parent company Google faces increasing pressure to control content spanning violence and misinformation, particularly ahead of the 2020 U.S. election and amid federal investigations.

The plaintiff, who is referred to as “Jane Doe,” worked as a YouTube content moderator for the staffing firm Collabera from 2018 to 2019 and experienced nightmares, panic attacks and an inability to be in crowded areas as a result of the violent content she viewed while working for the company, the lawsuit says.

YouTube’s “Wellness Coaches” weren’t available for people who worked evening shifts and were not licensed to provide professional medical guidance, the suit says. It also alleges moderators had to pay for their own medical treatment when they sought professional help.

Neither YouTube nor Collabera immediately responded to a request for comment.

The suit says many content moderators remain in the position for less than a year and that the company is “chronically understaffed,” so moderators end up working overtime and exceeding the company’s recommended four-hour daily viewing limit. Despite the demands of the job, moderators have little margin for error, the suit states.

The company expects each moderator to review between 100 and 300 pieces of video content each day with an “error rate” of 2 to 5 percent, the suit claims. The companies also control and monitor how videos are displayed to moderators: whether they appear full-screen or as thumbnails, whether they are blurred, and how quickly moderators must watch them in sequence.

The suit comes as moderators for social media companies speak out about the toll the job takes on their mental health. YouTube has thousands of content moderators, most of whom work for third-party vendors including Collabera, Vaco and Accenture. The Joseph Saveri Law Firm, a San Francisco-based firm representing the plaintiff, filed a similar lawsuit against Facebook that resulted in a $52 million settlement in May.

It also comes as Google-owned YouTube has reportedly returned to using human reviewers to find and delete content, after relying on computers to automatically sift through videos during the pandemic. The company switched back because the automated systems were removing too many videos that didn’t violate any rules.

