Facebook is Paying Content Moderators $52 Million for Psychological Damage

As a result of a recent lawsuit, Facebook has agreed to pay $52 million to content moderators as compensation for mental health issues developed on the job. The announcement follows a preliminary settlement filed on Friday in San Mateo Superior Court.

For the uninitiated, a Facebook moderator is tasked with reviewing and flagging gruesome media, which exposes them to all manner of inappropriate and sometimes scarring content.

Each former and current moderator will receive a minimum of $1,000. They will be eligible for additional compensation if they are diagnosed with post-traumatic stress disorder or another psychological condition.

Currently, the lawsuit covers around 11,250 moderators, and approximately 50% of them are expected to have developed conditions that require extra compensation, including addiction and depression. The lawsuit details that moderators and ex-moderators who are diagnosed with a mental health condition are eligible for an additional $1,500, and those with concurrent diagnoses, such as PTSD alongside another condition, can be eligible for up to $6,000.

Facebook has been hiring moderators for years to prevent graphic images and videos of terrible acts, murder, and suicide from circulating on the website. Over the years, however, this has taken a toll on the mental health of the people reviewing that content for the rest of us.

Steve Williams, a lawyer for the plaintiffs, said in a statement:

We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.

Facebook, for its part, says it is ready to compensate for any and all damage done to moderators’ mental health on the job. The social media giant said in a statement that it was:

Grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.

It is disappointing that a lawsuit was required to awaken Facebook’s “gratefulness and commitment to additional support” towards its former and current moderators.

Fortunately, artificial intelligence is increasingly removing the need for workers to view distressing content.
