
Why India needs to be the centre for content moderation reform

THE ASIAN AGE. | ROHAN SETH
Published : May 16, 2020, 3:10 pm IST
Updated : May 16, 2020, 3:10 pm IST

A significant number of moderators for most platforms are contracted through companies such as Cognizant, Genpact, and Accenture.

Facebook and other online platforms have been trying to minimise hate speech on social media by getting content moderated.

Earlier this month, Facebook settled a lawsuit that requires the company to pay $52 million to content moderators suffering from mental health issues. In that light, the next time you browse your Facebook or Instagram feed, take a moment to realise what a miracle it is that the content there tends to be suitable for consumption. When social media platforms reached hundreds of millions of monthly active users (credit to Chamath Palihapitiya for coming up with the metric), they brought a lot of humanity’s worst instincts online.

Think child sexual abuse, cannibalism, animal cruelty, and violence towards infants. Thanks to easy access to the internet and cheap smartphones, all of this is posted to platforms like 8chan, Facebook, and Instagram. There are two significant points of failure here.

Firstly, the majority of this content has to be moderated by humans. This leads to a whole host of mental health issues, best documented by Casey Newton of The Verge. In a detailed report on the topic, Newton writes, “It is an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions”. In addition, “In hope for a dopamine rush amid the misery (people)... have been found having sex inside stairwells”. A significant number of moderators for most platforms are contracted through companies such as Cognizant, Genpact, and Accenture. As standard practice, they are asked to sign non-disclosure agreements that bar them from revealing details of their work even to their families. Moderators making hundreds of content decisions a day tend to suffer from post-traumatic stress disorder (PTSD). They are often in need of therapy, which is neither provided nor insured.

Secondly, it is hard to maintain standards of free speech when platforms operate in more than a hundred countries. Not all countries are democracies, and even among democracies there are differences in what is acceptable under the free speech umbrella. As a platform, if you maintain American standards of expression, you may be blamed for not adapting well enough to local norms. At the same time, it is hard to build the capacity to conform to speech standards in over a hundred languages.

As a result, setting standards of expression is a dynamic process, with guidelines for what is acceptable being constantly updated. As an aside, that is partly why Facebook’s ‘Supreme Court’, its Oversight Board, was constituted.

In light of all of this, let us try to make sense of the $52 million settlement, because there is a lot of fine print to cover. While the sum is the most significant acknowledgement by Facebook of how damaging content moderation can be for ‘employees’, it does not apply to moderators in all countries. Specifically, the lawsuit covers only people who have worked for Facebook through third-party vendors in the United States from 2015 until today (an estimated 11,250 people).

While the sum itself is significant, I would argue that changes to how content moderation takes place are worth more. Learnings from this settlement should not be limited to the US but applied globally, starting with India.

The reason I emphasise India is that two forces make content moderation here a significant pain point. Firstly, India is one of the global capitals of the BPO industry, the critical reasons being its massive user base and a large population that speaks English as a second language. You might picture content moderators working out of a dingy basement lit by computer screens; in reality, they are based in big corporate buildings in Gurugram.

Secondly, going to therapy is taboo in India, even among urban elites. So moderators facing mental health challenges may find it hard to talk about them, and their pleas might even fall on deaf ears at home and in the workplace. 

According to The Verge, the settlement makes meaningful changes to content moderation tools that may help mitigate the mental health issues caused by the job, and it is these changes that India’s content moderation industry should be adopting. Some of these tools include changing videos to black and white and muting audio by default. In addition, the settlement increases the availability of mental health professionals to moderators: not just counsellors (who are known to be more worried about getting employees back to work than about their mental health), but also individual and group therapy sessions.

You could put a price tag on what it costs to keep platforms clean of harmful content; $52 million is a good starting point (and an underestimate). But the learnings that come out of this experience have the potential to be priceless: not just in terms of the money they can potentially save in counselling costs, but in terms of preventing the mental harm that content moderation causes the people who undertake it.

Tags: facebook, content moderation, hate speech, child sexual abuse