Photo by Michele Doying / The Verge

In November 2018 I received a message that changed my life. A person working as a moderator for Facebook in Phoenix through a company called Cognizant asked to get on the phone and talk about some of what he was seeing there. His experiences shocked me, and after I wrote about what he and his colleagues were going through in The Verge, they would go on to shock a lot more people.
It was an office where moderators would have panic attacks while still in training, traumatized by daily exposure to gore and other disturbing posts. Where ever-shifting content policies, and demands for near-perfect accuracy, could make the job itself impossible. And where months of sifting through conspiracy theories led some moderators to embrace fringe viewpoints, walking through the building insisting that the earth is flat.
I wrote about the experiences of a dozen current and former moderators at the Phoenix site last February. A few months later, after hearing from employees that conditions at Cognizant’s Tampa site were even more grim, I traveled there and talked to a dozen more workers. There I learned of a stressed-out moderator who died of a heart attack at his desk at the age of 42. I learned of multiple sexual harassment suits that had been filed against various workers at the site. And I met three brave former moderators who violated their non-disclosure agreements to describe their working conditions on camera.
By then a lawsuit by a former moderator named Selena Scola, which accused Facebook of creating an unsafe workplace that had caused her mental health problems, was working its way through the courts. And on Friday, lawyers filed a preliminary settlement in the case. I wrote about it today at The Verge:
In a landmark acknowledgment of the toll that content moderation takes on its workforce, Facebook has agreed to pay $52 million to current and former moderators to compensate them for mental health issues developed on the job. In a preliminary settlement filed on Friday in San Mateo Superior Court, the social network agreed to pay damages to American moderators and provide more counseling to them while they work.
Each moderator will receive a minimum of $1,000 and will be eligible for additional compensation if they are diagnosed with post-traumatic stress disorder or related conditions. The settlement covers 11,250 moderators, and lawyers in the case believe that as many as half of them may be eligible for extra pay related to mental health issues associated with their time working for Facebook, including depression and addiction.
“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” said Steve Williams, a lawyer for the plaintiffs, in a statement. “The harm that can be suffered from this work is real and severe.”
After a year of reporting on the lives of these moderators — I also profiled people who do the work for Google and YouTube — it seemed clear to me that some percentage of people who work as moderators will suffer long-term mental health consequences. But what is that percentage?
Last year I published some leaked audio from a Facebook all-hands meeting in which CEO Mark Zuckerberg acknowledged this range of experiences. “Within a population of 30,000 people, there’s going to be a distribution of experiences that people have,” Zuckerberg said, referring to the number of people Facebook has working on trust and safety issues around the world. “We want to do everything we can to make sure that even the people who are having the worst experiences, that we’re making sure that we support them as well as possible.”
One of the most interesting aspects of today’s news is that it begins to answer the question of how many moderators are affected. Designing a settlement required that lawyers for Facebook and the plaintiffs estimate how many people would make claims. And the number is much higher than I had imagined.
This lawsuit covers only people who have worked for Facebook through third-party vendors in the United States from 2015 until today, a group whose size is estimated to be 11,250 people. (A similar lawsuit is still pending in Ireland covering European workers.) Both Facebook and plaintiffs’ lawyers consulted with experts in post-traumatic stress and vicarious trauma. Based on those discussions, a lawyer for the plaintiffs told me, as many as half of the members of the class are expected to qualify for additional payments.
In other words, if you become a moderator for Facebook, a legal precedent suggests you have a one in two chance of suffering negative mental health consequences for doing the work.
Perhaps those odds will come down as Facebook implements some of the other changes it agreed to in the settlement, such as providing more counseling and offering workers tools to adjust the content that they’re viewing: turning it black and white, turning off audio by default, and so on. But the risk to human lives is real, and it’s not going away.
Another aspect to consider: how much will the average moderator get paid as a result of the settlement? The $52 million figure is less impressive when you consider that fully 32.7 percent of it has been earmarked for the lawyers in the case, leaving about $35 million for everyone else.
The settlement was designed to compensate moderators in tiers. The first tier grants $1,000 to everyone, in the hopes that moderators use the money to get a mental health checkup from a doctor. For those who are either newly diagnosed or already have diagnoses, the settlement provides an additional $1,500 to $6,000 based on the severity of their cases. And then moderators can also submit evidence of distress suffered as a result of their work to win up to $50,000 per person in damages.
The sums could all be much smaller depending on how many members of the class apply and are found eligible for benefits beyond the first $1,000. If half the class were found eligible for additional mental health benefits and received equal compensation (which will not be the case, but is useful for ballpark-estimation purposes), there would be $4,222.22 available per eligible moderator, on top of the initial $1,000.
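The ballpark above can be checked with a few lines of Python. The flat $35 million net fund and the even split among eligible moderators are simplifying assumptions for estimation, not figures from the filing:

```python
# Back-of-the-envelope math for the per-moderator figure above.
# Simplifying assumptions (mine, not from the settlement filing):
# a flat $35 million remains after attorneys' fees, and exactly half
# the class qualifies for extra payments, split evenly.

TOTAL_SETTLEMENT = 52_000_000
ATTORNEY_FEES = TOTAL_SETTLEMENT * 0.327   # $17,004,000 earmarked for lawyers
NET_FUND = 35_000_000                      # roughly what remains for the class
CLASS_SIZE = 11_250                        # US moderators covered, 2015-present
BASE_PAYMENT = 1_000                       # guaranteed to every class member

after_base = NET_FUND - CLASS_SIZE * BASE_PAYMENT   # left after $1,000 payments
eligible = CLASS_SIZE // 2                          # half the class qualifies
per_eligible = after_base / eligible                # extra pay per eligible member

print(round(per_eligible, 2))   # 4222.22
```

In practice the actual payouts would vary widely across the $1,500–$50,000 tiers, so this is an average, not a prediction of any individual check.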
In my Twitter replies, lots of folks objected to the size of the payment, arguing it should have been much higher. Here, for example, is one person who called the settlement “a day’s worth of random market fluctuation in profits for Facebook.” I won’t argue the point: many of these content moderation roles are essentially first responder jobs, not nearly as different as you might think from police officers and paramedics, and they deserve compensation and benefits more closely in line with the service they provide and the risks that they take.
I called up Shawn Speagle, a former Facebook moderator who worked at the Tampa site, to tell me what he thought. Speagle, who was not involved in the lawsuit, worked for Cognizant from March to October 2018. During that time, he was exposed to videos of extreme violence and animal abuse on a near-daily basis, and he began to overeat and experience night terrors. After being fired, he was diagnosed with PTSD.
He said that a year of psychiatric care had helped him significantly with his symptoms, but also that the things he had seen continue to haunt him. “It’s been a very long ride,” Speagle told me Tuesday. “It’s been very difficult to forget about a lot of that stuff. You never do — it just sticks with you forever. Even though it was just seen over a screen, those lives are never coming back. I just wish that Facebook would recognize that.”
Speagle said that he sometimes felt embarrassed describing his PTSD to others, worrying they wouldn’t quite believe a person could develop the condition by reviewing Facebook posts. “There were a lot of times when it was humiliating,” he said. But psychiatrists helped him to understand that the phenomenon known as vicarious trauma — watching others experience pain — is real, and can be dangerous. He has since become a public school teacher.
I asked him what he thought of the payout he might now be eligible for.
“I would be fine getting no money,” Speagle told me. “I just wanted to bring this forward. When I did the job at Facebook, I was told I was making the world a better place for animals and young people. The reason I came forward was to stay true to that. Money and a lawsuit have nothing to do with what I did.”