Facebook Parent Meta Sued in Kenya by Former Content Moderator

Daniel Motaung remembers watching a video of a beheading when he worked as an outsourced Facebook content moderator in Kenya. Viewing violent and graphic content, he said, ended up taking him to a place he never imagined.

“Now, I have a heightened fear of death because of the content that I’ve moderated on a daily basis. And because of that, my quality of life has changed drastically,” he said during a virtual discussion Tuesday. “I don’t look forward to going outside. I don’t look forward to going in public spaces.”

The discussion, titled “Facebook Content Moderation, Human Rights: Democracy and Dignity at Risk,” came on the same day that attorneys for the former content moderator filed a lawsuit against Facebook parent company Meta and Sama, the outsourcing firm that partners with the social media giant for content moderation in Africa. The 52-page petition alleges that the companies violated the Kenyan constitution, accusing them of forced labor, human trafficking, treating workers in a “degrading manner” and union-busting. Motaung was fired from his job in 2019 when he tried to form a trade union, the lawsuit said.

The lawsuit, filed in Nairobi’s employment and labor relations court, is the latest in ongoing criticism Meta has faced over the working conditions of content moderators. In 2020, the company reached a $52 million settlement after content moderators in the US sued Facebook for allegedly failing to provide them with a safe workplace. The social network, which has more than 15,000 moderators, has struggled to police offensive content in multiple languages worldwide.

Meta spokesperson Grant Klinzman declined to comment on the lawsuit. The company has previously said it takes its commitments to content reviewers seriously, that it requires partner companies to provide competitive pay, benefits and support, and that it routinely audits those companies. Suzin Wold, a spokesperson for Sama, said in a statement that the allegations against the company “are both inaccurate and disappointing.” She said the company has helped lift more than 59,000 people out of poverty, has provided workers a competitive wage and is a “longstanding, trusted employer in East Africa.”

The lawsuit alleges that Sama targets poor and vulnerable youth for content moderation jobs, coercing them into signing employment contracts before they really understand what the role entails. Motaung, who came from a poor family, was looking for a job to support his family after college and didn’t know that content moderation could harm his mental health, the lawsuit said. He has since suffered from post-traumatic stress disorder, severe depression, anxiety, a relapse of his epilepsy, and vivid flashbacks and nightmares from moderating graphic content.

Content moderators aren’t given enough mental health support, must deal with low pay and can’t discuss their struggles with family and friends because they’re required to sign a non-disclosure agreement, the lawsuit said.

“A Facebook moderator must make high-stakes decisions about extremely difficult political situations and even potential crimes — and they do so in a workplace setting that treats their work as low-value, disposable work, as opposed to essential and dangerous front-line work protecting social media users. In short, Facebook moderators sacrifice their own health to protect the public,” the lawsuit said.

Motaung, who shared his story in February with Time, said Meta has outsourced the responsibility of protecting workers to outsourcing companies and is using people for profit. 

A group of Facebook critics called the Real Facebook Oversight Board, as well as Foxglove and The Signals Network, hosted Tuesday’s panel discussion. In a blog post, the groups urged Meta to give outsourced content moderators the same level of pay, job security and benefits as its own employees. They’re also asking Meta to make other changes, such as publishing a list of the outsourcing firms it works with for content moderation.

Motaung said he believes that content moderation can be improved and has his own ideas as someone who has done the job.

“I’ve actually experienced the destruction of my own mental health and life in general, so what I’m hoping to achieve is to change that, because I believe that content moderators can be dealt with in a better way,” he said. 
