Africa


Kenya is hosting unprecedented lawsuits against Meta Inc., the parent company of Facebook, WhatsApp, and Instagram. Mercy Mutemi, who made last year’s TIME 100 list, is a Nairobi-based lawyer who is leading the cases. She spends her days thinking about what our consumption of digital products should look like in the next 10 years. Will it be extractive and extortionist, or will it be beneficial? What does it look like from an African perspective?

Question: Behind the legal battle with Meta are workers and their conditions. What challenges do they face in these tech roles, particularly content moderation?

Mercy Mutemi: Content moderators in Kenya face horrendous conditions. They’re often misled about the nature of the work, not warned that the work is going to be dangerous for them. There’s no adequate care provided to look after these workers, and they’re not paid well enough. And they’ve created this ecosystem of fear — it’s almost like this special Stockholm syndrome has been created where you know what you’re going through is really bad, but you’re so afraid of the NDA that you just would rather not speak up.

[...]

Content moderation work, annotation work, and algorithm training, [...] in its very nature involves a lot of exposure to harmful content. That work is dumped on Kenya. Kenya says it’s interested in digital development, but what Kenya ends up getting is work that poses serious risks, rather than meaningful investment in its people or infrastructure.

[...]

When the initial version of ChatGPT was released, it had lots of sexual violence in it. So to clean up an algorithm like that, you just teach it all the worst kinds of sexual violence [...] if you ask ChatGPT to show you the worst rape that could ever happen, there are now metrics in place that tell it not to give out this information because it’s been taught to recognize what it’s being asked for. And that’s thanks to Kenyan youth whose mental health is now toast, and whose lives have been compromised completely.

[...]

Big Tech is not planting any roots in the country [of Kenya] when it comes to hiring people to moderate content or train algorithms for AI. They’re not really investing in the country in the sense that there’s no actual person to hold liable should anything go south. There’s no registered office in Kenya for companies like Meta, TikTok, or OpenAI.

[...]

Instead, what you have are these middlemen. They’re called Business Process Outsourcing, or BPOs [...] It’s almost like they’re agents of big tech companies. So they will do big tech’s bidding. If the big tech says jump, then they jump. So we find ourselves in this situation where these companies purely exist for the cover of escaping liability.

[...]

[The workers'] mental health is destroyed – and there are often no measures in place to protect their well-being or respect them as workers. And then it’s their job to figure out how to get out of that rut because they still are a breadwinner in an African context, and they still have to work, right? And in this community where mental health isn’t the most spoken-about thing, how do you explain to your parents that you can’t work?

[...]

I think when you give people work for a period of time and those people can’t work again because their mental health is destroyed, that doesn’t look like lifting people out of poverty to me. That looks like entrenching the problem further because you’ve destroyed not just one person.

[...]

MM: Let me just be very categorical. My position is not that this work shouldn’t be coming into Kenya. But it can’t be the way it is now, where companies get to say “either you take our work and take it as horrible as it is with no care, and we exploit you to our satisfaction, or we leave.” No. You can have dangerous work done in Kenya, but with an appropriate level of care, with respect, and upholding the rights of these workers. It’s going to be a long journey to achieve justice.

[...]