
An ex-employee filed a lawsuit against Facebook last week, alleging that the company does not adequately protect the content moderators whose jobs involve reviewing disturbing content on the platform.

Why is Selena Scola, a content moderator, suing Facebook?

“Plaintiff Selena Scola seeks to protect herself and all others similarly situated from the dangers of psychological trauma resulting from Facebook’s failure to provide a safe workplace for the thousands of contractors who are entrusted to provide the safest environment possible for Facebook users,” reads the lawsuit.

Facebook receives millions of videos, images, and broadcast posts depicting child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder. To keep the platform safe for its users, Facebook relies on machine learning augmented by human content moderators. This ensures that any content violating the corporation’s terms of use is removed from the platform as quickly as possible.

“Facebook’s content moderators are asked to review more than 10 million potentially rule-breaking posts per week. Facebook aims to do this with an error rate of less than one percent, and seeks to review all user-reported content within 24 hours,” says the lawsuit.

Although this safeguard helps maintain safety on the platform, content moderators witness thousands of pieces of such extreme content every day. The lawsuit highlights that this constant exposure to disturbing imagery traumatizes content moderators, many of whom end up developing post-traumatic stress disorder (PTSD).

What does the law say about workplace safety?

Facebook, like many other tech giants, claims to have already drafted workplace safety standards to protect content moderators. These reportedly include providing moderators with mandatory counseling and mental health support, and altering the resolution and audio of traumatizing images. Facebook also aimed to train its moderators to recognize the physical and psychological symptoms of PTSD. We have, however, found it difficult to locate the said document.

However, as per the lawsuit, “Facebook ignores the workplace safety standards it helped create. Instead, the multibillion-dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma.” This runs counter to California law, which states:

“Every employer shall do every other thing reasonably necessary to protect the life, safety, and health of Employees. This includes establishing, implementing, and maintaining an effective injury prevention program. Employers must provide and use safety devices and safeguards reasonably adequate to render the employment and place of employment safe”.

Facebook hires content moderators on a contract basis

Tech giants such as Facebook generally have a two-tier workforce. The top tier comprises Facebook’s official employees, such as engineers, designers, and managers, who enjoy most of the benefits, including high salaries and lavish perks.

Workers such as content moderators fall into the lower tier. The majority of them are not even permanent employees at Facebook; they are hired on a contract basis. Because of this, they are often poorly paid, miss out on the benefits that regular employees receive, and have limited access to Facebook management.

One such worker, who wished to remain anonymous, told the Guardian last year, “We were underpaid and undervalued.” After a two-week training period, he earned roughly $15 per hour removing terrorist-related content from Facebook.

These workers usually come from poor financial backgrounds, and many have families to support; taking such a job seems a better option than unemployment. Selena Scola was employed by Pro Unlimited, a contingent labor management company based in New York, as a Public Content Contractor from approximately June 19, 2017, until March 1, 2018, working at Facebook’s offices in Menlo Park and Mountain View, California. During this entire period, Scola was employed solely by Pro Unlimited, an independent contractor of Facebook; she was never directly employed by Facebook in any capacity. Scola is also suing Pro Unlimited.

“According to the Technology Coalition, if a company contracts with a third-party vendor to perform duties that may bring vendor employees in contact with graphic content, the company should clearly outline procedures to limit unnecessary exposure and should perform an initial audit of a contractor’s wellness procedures for its employees,” says the lawsuit.

Scola is not the only one to have complained about the company. Last month, over a hundred conservative Facebook employees formed an online group to protest the company’s “intolerant” liberal culture. The exodus of high-profile executives also points to deeper people and culture problems at Facebook.

Additionally, Facebook has been embroiled in many controversies over user data, fake news, and hate speech. Last month, the Department of Housing and Urban Development (HUD) filed a complaint against Facebook for selling ads that discriminate against users on the basis of race, religion, and sexuality. Facebook has similarly faced accusations of discriminatory advertising: it reportedly provided third-party advertisers with an option to exclude religious minorities, immigrants, LGBTQ individuals, and other protected groups from seeing their ads.

Given the increasing number of controversies surrounding Facebook, it is high time the company took the right measures to address these issues. The case is Scola v. Facebook Inc. and Pro Unlimited Inc., filed in the Superior Court of the State of California.

For more information, read the official lawsuit.

Read Next

How far will Facebook go to fix what it broke: Democracy, Trust, Reality

Facebook COO, Sandberg’s Senate testimony: On combating foreign influence, fake news, and upholding election integrity

Time for Facebook, Twitter and other social media to take responsibility or face regulation
