
Judgement call: the unseen pressures on the people who police the internet

8 August 2025
A line of people sitting at computer screens in an office. (Photo credit: GCShutter/Adobe)

Key points

  • Online content moderators in India have high productivity targets, assessing around 1,500 items per 8-hour shift.
  • They often prioritise items that take the least time to assess, which can lead to problematic content staying online, or non-offensive content being removed.
  • Despite dealing with often disturbing images and footage, there is minimal mental health support.

The harsh working conditions of human online content moderators adversely affect internet content, research has shown.

Tania Chatterjee, a joint PhD student from The University of Queensland’s School of Communication and Arts and the Indian Institute of Technology Delhi, led a study of the decision-making processes of social media moderators in India.

“Human moderators filter content flagged by an algorithm or users as potentially problematic,” Ms Chatterjee said. 

“This important work is often outsourced by online platforms, and the moderators I interviewed in India assessed content from the United States, South America, Europe and the Middle East.

“What quickly became apparent was the immense pressure and workloads on these people who are employed to keep the internet safe.”

Ms Chatterjee said the moderators in the study were routinely given around 1,500 items to review in an 8-hour shift.

“This means they have just 15-20 seconds to make a contextual judgement with high accuracy,” she said.

“They’re also expected to follow extensive guidelines provided by the platform, often internal and considered trade secrets.

“Meeting targets with accuracy under their employment conditions just isn’t reasonable.”

Ms Chatterjee said moderators commonly made content decisions simply based on what would take the least amount of time.

“Through screensharing I saw how moderators prioritised content they could assess quickly, which left the more complex cases outstanding,” she said.

“On some platforms, problematic content was more likely to be removed altogether because removal was a 2-step process, whereas de-ranking it to be less visible online took 4 steps.

“The outcome is two-fold: problematic content that should be removed stays online, and content that is innocent once the appropriate context is applied gets removed.

“Moderators are also more likely to rely on automation tools to remove flagged words or phrases, because it’s quicker than making an independent assessment.”

Ms Chatterjee said the moderators in the study were low-paid, received limited job training and rarely accessed mental health support, if it was offered at all.

“With an employment crisis in India, workers are unlikely to complain about their labour conditions and many moderators are new university graduates getting what they think is a foot in the door in the digital space,” she said.

“Online platforms really need to re-think moderator targets and implement some simple design changes in their portals to streamline processes.

“They should also be more transparent about how much they spend on human moderators, both in-house and outsourced, to ensure it’s proportionate to the amount of content they host. 

“Human content moderators have a crucial role in policing the internet, but our research shows how harsh employment conditions shape the outcomes of their work.”

The research was published in New Media & Society.
