New Delhi, Oct 21 (IANS) Amid the surge in Covid-19 cases in India, third-party firms working for Facebook allegedly pressurised the content moderators to get back to work, a report from the non-profit publication ‘Rest of World’ has claimed.
In Hyderabad, at least 1,600 people are employed by global professional services firm Genpact to do content moderation for Facebook.
“This summer, even as Covid-19 cases were surging in India, Genpact moderators said they felt pressured by their employer to come back to the office,” the report said on Tuesday.
“While most of Facebook’s full-time employees remain safe at home, these workers have been forced to choose between their health and their livelihoods,” it claimed.
In a statement shared with IANS, a Facebook spokesperson said: “Our focus for reopening any office is on how it can be done in a way that keeps our reviewers safe”.
“To do this we are putting strict health and safety measures in place, making sure they’re followed, and addressing and disclosing any confirmed cases of illness,” the spokesperson added.
Rest of World said it spoke with four current and former Genpact employees.
“They said moderators were asked — in some cases as early as July — to return to the office to tackle sensitive content, including posts involving child exploitation, suicide, and other material that could lead to real-world harm,” the report mentioned.
In a statement given to the publication, Genpact asserted that moderators are being asked to come to the office only on a voluntary basis.
“To make this manageable, safe and clear, employees need to sign a weekly form that asks them to voluntarily agree to this,” a company spokesperson told Rest of World.
The report quoted a senior content moderator as saying that Genpact employees were informed they could lose their jobs if they didn’t come to the office.
“The operations team told them these are important orders,” said the moderator. “There’s a threatening factor behind (it).”
Facebook has more than 30,000 employees working on safety and security, about half of whom are content moderators.
The social networking giant in May agreed to pay $52 million to third-party content moderators who developed post-traumatic stress disorder (PTSD) and other mental health issues as they scanned scores of disturbing images of rape, murder and suicide in order to keep such content off the platform.
According to The Verge, in a preliminary settlement in San Mateo Superior Court, the social networking giant agreed to pay damages to 11,250 US-based moderators and provide more counselling to them.
Facebook has hired several firms like Accenture, Cognizant, Genpact and ProUnlimited to help it moderate and remove harmful content in the aftermath of the 2016 US presidential election and Cambridge Analytica data scandal.
Last year, several moderators told The Verge that they had been diagnosed with PTSD after working for Facebook.
Cognizant later announced that it would quit the content moderation business and shut down its sites earlier this year. The company also developed “the next generation of wellness practices” for affected content moderators.