Facebook has been in the news in recent years for its data leaks and data privacy concerns. This time the company is on the radar for the deplorable working conditions of its content moderators. The reviewers are so affected by the content on the platform that some cope with their trauma by having sex and using drugs at work, reports The Verge in a compelling and horrifying look into the lives of content moderators who work on contract for Facebook at a site in Phoenix, Arizona.
There was a similar report against Facebook last year: in September, an ex-employee filed a lawsuit against the company for not providing enough protection to the content moderators responsible for reviewing disturbing content on the platform.
The platform hosts millions of videos and images depicting child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder. Facebook relies on machine learning augmented by human content moderators to keep the platform safe for users, which means any content that violates the company’s terms of use must be reviewed and removed.
In a statement to CNBC, a Facebook spokesperson said, “We value the hard work of content reviewers and have certain standards around their well-being and support. We work with only highly reputable global partners that have standards for their workforce, and we jointly enforce these standards with regular touch points to ensure the work environment is safe and supportive, and that the most appropriate resources are in place.”
The company has also published a blog post about its work with partners like Cognizant and the steps it is taking to ensure a healthy working environment for content reviewers.
As reported by The Verge, the contracted moderators get one 30-minute lunch, two 15-minute breaks, and nine minutes of “wellness time” per day. Much of this time, however, is spent waiting in queues for the bathroom, where three stalls per restroom serve hundreds of employees.
The work environment is such that moderators cope with stress by telling dark jokes about committing suicide and smoking weed during breaks to numb their emotions. According to the report, it is a place where employees can be fired for making just a few errors a week, and where team leaders make moderators’ days harder by micromanaging their bathroom and prayer breaks.
The moderators are paid $15 per hour for moderating content that could range from offensive jokes to potential threats to videos depicting murder.
A Cognizant spokesperson said, “The company has investigated the issues raised by The Verge and previously taken action where necessary and have steps in place to continue to address these concerns and any others raised by our employees. In addition to offering a comprehensive wellness program at Cognizant, including a safe and supportive work culture, 24×7 phone support and onsite counselor support to employees, Cognizant has partnered with leading HR and Wellness consultants to develop the next generation of wellness practices.”
Public reaction to this news has been mostly negative, with users condemning how the company is run.
After mulling over this fine piece today, and thinking about all of @facebook’s offenses, lies and blunders, I’ve concluded that the company is a mix of the supposedly missing features of the Scarecrow, the Tin Man and the Cowardly Lion. https://t.co/zCvtYHbPOQ
— Walt Mossberg (@waltmossberg) February 26, 2019
I never thought about it. Just assumed it was automated in the back of my head. Couldn’t imagine a human being sitting through the filth having to press the buttons..
— C9 Hawks 鹰巢 (@HawksNest) February 25, 2019
Imagine how the poor robots at Youtube felt when I mass flagged videos of women stepping on snails.
— Lieutenant BaconWaffles (@likalaruku) February 26, 2019
This is a strawman. As @sivavaid has explained, Facebook’s impossible moderation problem is of its own making. The firehose of the worst humanity has to offer is not inevitable; it’s a result of Facebook’s architecture. Journalists are 100% right to call out both problems. https://t.co/hP2hwOb5h8
— Blake Reid👨🏻💻 (@blakereid) February 25, 2019
People are angry that content moderators at Facebook endure such trauma in their role. Some believe compensation should be given to those suffering from PTSD as a result of working in high-stress roles, in companies across industries.
It took me half a pound of pretzels to make it through the FB moderation article and my main conclusion is that we should all be fighting to get occupational PTSD covered under workers comp, it's frankly wild to me that it isn't in most states/jobs?? https://t.co/oLJRYrGuIx
— Leigh Honeywell (@hypatiadotca) February 26, 2019
According to Kevin Collier, a cyber reporter, Facebook is underpaying and overworking content moderators in a desperate attempt to rein in abuse of the platform it created.
Truly, both sides are to blame here. Facebook for underpaying and overworking content moderators in a desperate attempt to reign in abuse of the platform it created. And journalists for pointing that out.
— Kevin Collier (@kevincollier) February 25, 2019
One user tweeted, “And I’ve concluded that FB is run by sociopaths.” YouTube has rolled out a feature in the US that displays notices below videos uploaded by news broadcasters that receive government or public funding. Alex Stamos, former Chief Security Officer at Facebook, suggested something similar for Facebook: according to him, Facebook needs a state-sponsored label, and people should know the human cost of policing online humanity.
FB needs a state-sponsored label that will follow each post that BBC/PBS/A(u)BC/CBC won't object to. Fortunately, YouTube has opened that door.
— Alex Stamos (@alexstamos) February 25, 2019
To know more, check out the full report by The Verge.
Read Next
Ex-employee on contract sues Facebook for not protecting content moderators from mental trauma
NIPS 2017 Special: Decoding the Human Brain for Artificial Intelligence to make smarter decisions