Published by TJ Denzer
The internet can be an absolutely terrible place. In the darkest corners of free information and content-sharing platforms like Facebook, there are nearly always groups attempting to use those platforms to spread unspeakably offensive content. Dealing with that content falls to moderators, who regularly handle it, and the trauma that comes with it, under NDA. Now, Facebook’s moderators have spoken out against NDAs they feel are too strict and not matched by compensation that warrants them.
Content moderators in both the US and Europe recently wrote a joint letter to Facebook CEO Mark Zuckerberg, COO Sheryl Sandberg, and the CEOs of Covalen and Accenture, as reported by The Verge. The letter claims that the NDAs they work under go beyond protecting user data and promote a damaging culture of “excessive secrecy.” This comes amid ongoing debate over the long-term effects of moderating content and the lack of proper compensation or care for those effects. In May, Facebook moderator Isabella Plunkett shared that while the company supplies life coaches to help with the stress, it’s not enough to address the mental trauma.
“These people mean really well, but they are not doctors,” Plunkett said in testimony.
The letter asks Facebook for proper access to psychiatric and psychological care, as well as full integration into the company with fitting pay and benefits. Often working as contractors through third-party firms, the moderators feel they are not offered health and safety measures adequate to the rigors of the job.
“Imagine watching hours of violent content or children abuse online as part of your day to day work,” the letter states. “You cannot be left unscathed. This job must not cost us our mental health.”
Moderators in Ireland, Portugal, Spain, and the United States provided around 60 signatures to the letter, alongside UK tech justice nonprofit Foxglove. Facebook, however, pushed back against the idea that moderators are not offered proper access to mental health care for their work.
“We recognize that reviewing content can be a difficult job, which is why we work with partners who support their employees through training and psychological support when working with challenging content,” a Facebook spokesperson said. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We also use technology to limit their exposure to graphic material as much as possible.”
Even so, this is not the first time the rigors of content moderation and the adequacy of compensation for it have come up. In January 2020, it was discovered that YouTube content moderators were being asked to sign documents acknowledging possible PTSD from the mental stress and trauma associated with the work as part of their NDAs, and Accenture was involved in that case as well. Neither Accenture nor Covalen has shared a statement in this more recent instance.
As content moderation remains a necessity on social and content-sharing platforms, the question of just what that kind of work is worth continues to be a tough one. For the moderators on the front line, seeing the worst of what Facebook content has to offer, the answer is arguably not enough. Stay tuned as we continue to follow this story for further updates and information.