Facebook’s efforts to keep its communities safe may be coming at the expense of its moderators, as one former contractor is suing the company for ignoring workplace safety standards, leaving her with post-traumatic stress disorder.
According to the suit, reported by Mashable, former content moderator Selena Scola alleges that her nine-month job, which began in June 2017, required “constant and unmitigated exposure to highly toxic and extremely disturbing images.” As a result, she says she eventually began experiencing fatigue, insomnia, and social anxiety. She has since been formally diagnosed with PTSD.
She says her symptoms can be triggered if she “touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled,” or “when she recalls or describes graphic imagery she was exposed to as a content moderator,” according to the suit.
Scola was employed by Pro Unlimited Inc. and contracted out to Facebook. As a result, her lawyer Korey Nelson says, Facebook supports a “revolving door of contractors who are irreparably traumatized by what they witnessed on the job” instead of providing a “safe workplace” and addressing the contractors’ working conditions. Facebook’s content moderation workforce is made up of full-time employees, contractors, and outside companies.
Bertie Thomson, Facebook’s director of corporate communications, told Mashable that the company is reviewing the claim, and said that it takes “the support of our content moderators incredibly seriously.” Thomson said that these issues are addressed in training and benefits, and that the company ensures that “every person reviewing Facebook content is offered psychological support and wellness resources.”
Scola’s other attorney, Steve Williams, told Mashable in a statement that Scola ultimately wants Facebook to create a medical monitoring fund for content moderators that provides PTSD testing and treatment.
“Facebook needs to mitigate the harm to content moderators today and also take care of the people that have already been traumatized,” Williams said.