Meta ruled primary employer of moderators as court blocks their firing
A Kenyan court has ruled that Meta is the primary employer of the content moderators suing the social media giant and its content review partner in Africa, Sama, for unlawful dismissal. The 184 moderators, in the suit filed in March this year, also alleged that Meta’s new content review partner on the continent, Majorel, had blacklisted them at Meta’s instruction.
Justice Byram Ongaya of Kenya’s employment and labor relations court on Friday rejected the social media giant’s bid to remove itself from the case, saying the moderators did Meta’s work, used its technology to do it, and adhered to its performance and accuracy metrics. The court said that Sama was “merely an agent…or manager.” Sama disputed this, saying “Meta is a client of Sama’s and Sama is not legally empowered to act on behalf of Meta.”
Meta has not replied to a request for comment.
The latest development is a blow to Meta, which has sought to distance itself from the petition, arguing that it is not the moderators’ employer.
“The evidence is that the obligation to provide the digital work of content moderation belong to the first and second respondents who provided the digital or virtual workspace for the applicants. The first and second respondents exercise control by imposing the operational requirements and standards of performance. The first and second respondent then provided the remuneration back through the agent [Sama],” the court said.
“The third respondent [Sama] was acting as an agent of the owner of the work of content moderation the first and second respondents [Meta Platforms Inc and Meta Platforms Ireland Limited], there is nothing in the arrangements to absolve the first and second respondents as the primary and principal employers of the content moderators.”
Additionally, the court directed that the moderators’ contracts be extended and barred Meta and Sama from laying them off pending the determination of the case. The court issued the directions saying there was no suitable justification for the redundancies, and that it had “found that the job of content moderation is available. The applicants will continue working upon the prevailing or better terms in the interim.”
The moderators, hired from across the continent, including Ethiopia, Uganda, Somalia and South Africa, sift through social media posts on Meta’s platforms to remove content that perpetuates hate, misinformation and violence.
The moderators allege that Sama fired them illegally, without issuing the redundancy notices required by Kenyan law. The suit also claims, among other issues, that the moderators were not given a 30-day termination notice, and that their terminal dues were made conditional on their signing non-disclosure documents.
Sama has previously told TechCrunch that it observed Kenyan law and communicated its decision to discontinue content moderation at a town hall, as well as through email and notification letters.
Sama, whose clients include OpenAI, dropped Meta’s contract and its content review services to concentrate on labeling work (computer vision data annotation), issuing redundancy notices to 260 moderators in the process.
Meta and Sama are facing two other suits in Kenya. In one, Daniel Motaung, a South African, sued the companies for labor and human trafficking, unfair labor relations, union busting and failure to provide “adequate” mental health and psychosocial support. Motaung alleges he was laid off for organizing a 2019 strike and trying to unionize Sama’s employees.
In the other, Ethiopian petitioners filed suit in December last year, claiming that the social media giant failed to employ enough safety measures on Facebook, which, in turn, fueled conflicts that led to the deaths of 500,000 Ethiopians during the Tigray War, including the father of one of the petitioners.