
Kenyan workers win appeal to take Meta to trial

By admin

Oct 27, 2024




In September 2024, the Nairobi Court of Appeal ruled that two cases brought against Meta by 185 former Facebook and Instagram content moderators – one over allegedly poor working conditions and another over their mass firing – should proceed to trial.

The ruling follows an 18-month battle between the moderators and their former employer Meta, which has insisted that the Kenyan courts do not have the power to hear the cases. Although the tech giant had previously appealed against earlier High Court decisions in the two cases, both of which concern content moderators’ working conditions, the Court of Appeal has now ruled against those appeals.
The cases were initiated by former Facebook content moderator Daniel Motaung, who claims that Meta unlawfully subjected him and his colleagues to exploitation that harmed their mental health. Motaung began organising with his co-workers, forming a union to fight back against that exploitation. Meta quashed the action, unlawfully firing Motaung and 185 of his colleagues in a union-busting effort by the Silicon Valley giant.
Motaung was just one of many content moderators who lost their jobs, leaving them without employment. Soon after, it came to light that Meta planned to switch outsourcing companies at its Nairobi hub from Sama to Majorel, in effect blacklisting these workers from returning to work and prompting them to collectively launch a second case against their mass firing.
In a victory for the workers, the Court of Appeal ruled that the Kenyan courts have jurisdiction over both cases, allowing them to proceed to trial.
On top of the money they say Meta and its contractors owe them, the workers are seeking improvements in their working conditions. They are calling for the company to uphold their right to speak out about poor conditions and to join a trade union, as well as to introduce a system of mental health support in Kenya akin to that provided to employees at its Menlo Park and Dublin headquarters.
Computer Weekly contacted Meta about the court decisions and every aspect of the story, but received no response.

A pattern of multinational exploitation
The long-awaited decision has given content moderators hope that they will get justice for the exploitation they say Meta has subjected them to. Kauna Malgwi is one of the 185 former Facebook content moderators bringing the legal challenge against Meta and Sama. She is the chairperson of the Nigeria chapter of the African Content Moderators Union, and was featured in TIME’s 100 AI list.
“After nearly two years of time-wasting, I was elated to know that the day will soon come when we face Meta across a courtroom to hear them answer for their exploitation and abuse of myself and my colleagues,” she says.
Working for Facebook, Malgwi moderated content that included videos and images of rape, suicide and war atrocities. Despite the mental strain faced by workers, Meta has tried every tactic it can to prevent the case from progressing to trial.
“The courts take a long time and Meta has hired lawyers to delay our case as much as they can with dirty legal tricks and bad faith offers of mediation that ultimately went nowhere. You would think one of the most famous and powerful companies in the world would not need to stoop so low, but we have seen that there is no tactic too shameful for Meta to attempt,” says Malgwi.


This is not the first time that content moderators in Africa have taken action to hold Big Tech to account. The formation of the African Content Moderators Union by 150 African artificial intelligence (AI) workers – which seeks to secure better working conditions for content moderators, microworkers and data labellers – was an act of historic defiance against Big Tech, with one of its organisers, Richard Mathenge, named one of TIME’s 100 most influential people in AI.
The union’s content moderators went on to win a watershed court case in Kenya in June 2023, which ordered Meta to provide “proper medical, psychiatric and psychological care”. Similarly, in Colombia, outsourcing platform Teleperformance signed a historic agreement granting its 40,000 workers the right to form a union, following an investigation that exposed the dire working conditions of TikTok content moderators.
Those Computer Weekly interviewed say the exploitation of content moderators in the Global South fits a pattern of multinational corporations racing to the bottom for cheap labour.
Nairobi has become an epicentre of the AI outsourcing race, largely due to high unemployment, an increasingly educated youth population and the capital’s high proportion of English speakers, with outsourcing companies such as Sama offering entry-level jobs in tech.
A TIME investigation, for example, found in 2023 that OpenAI paid microworkers in Kenya between $1.32 and $2 an hour to review toxic content, label data, and remove harmful, violent and graphic material.

‘Essential work’
Some argue that microwork – the work that content moderators and data labellers do – should be viewed as essential work, which might help improve working conditions for these workers.
“Until we treat content moderation as real digital work that is needed and not easily automated, we will fail to value or see these workers. Content moderation is a cross between people responding to 911 calls and a park ranger,” says Mary L. Gray, senior principal researcher at Microsoft Research and co-author of Ghost Work.
“They deserve a decent workplace, hours that recognise the challenges of making snap judgements, and the ability to organise and collectively bargain to improve the quality of the work that they do for all consumers of social media.”
The poor working conditions facing content moderators and other microworkers are mirrored industry-wide. James Oyange, a former TikTok content moderator and organiser with the African Content Moderators Union, says: “As content moderators from a diverse range of tech giants like TikTok, Facebook and ChatGPT, we observed a common thread of concerns and challenges that were consistently left unaddressed.
“The arduous nature of our work, coupled with the lack of mental health support and recognition, led us to unite in pursuit of a more equitable and just environment for all content moderators and AI workers.”

A potential step change
But this latest ruling in Nairobi could help swing the pendulum, forcing Big Tech companies to recognise the value of these workers’ contributions and treat them accordingly.
Foxglove co-executive director Martha Dark, for example, said: “This ruling shows that despite Meta’s eye-watering resources, it can be beaten. And it has been beaten every time it has made the ridiculous, neo-colonial argument that Kenyan courts do not have the power to hear cases against an American company.”
Broader public awareness and understanding of the conditions of the workers who maintain our digital world is needed to ensure this decision goes the distance. As Adio Dinika, research fellow at the Distributed AI Research Institute, tells Computer Weekly: “We need sustained public awareness, policy changes and corporate accountability to address the systemic issues in the content moderation industry.
“The impact of this win could be far-reaching, potentially influencing how tech companies structure their content moderation operations globally. It may lead to improved working conditions, better mental health support and fairer compensation. Ultimately, this case serves as a wake-up call for the tech industry to recognise the human cost behind our sanitised digital experiences.”
When explaining what keeps her fighting, Malgwi reminds us that we have more in common with content moderators than we do with tech CEOs such as Mark Zuckerberg, and that this is what makes organising collective action against Big Tech so important.
“Taking on giants like Meta is hard, but essential if we are going to protect ourselves from exploitation by companies that are more powerful than most countries. We need to connect, organise and fight back against Meta across borders. We have taken the lead on this work in Africa, but now it needs to go further,” she says.
Dark adds a message for content moderators to take from the Court of Appeal decision: “Meta is scared – it’s scared because it knows that the charges laid against it by the Nairobi moderators are true. That’s why it refuses to engage with them, instead deploying fancy legal tricks and delaying tactics to dodge the issues. It’s nearly out of road – and we’re excited to see them in court.”


