
India’s AI Boom Runs On Women’s Unpaid Care And Cognitive Labour

In colonial data factories, the knowledge of women and young people is systematically devalued and their aspirations never allowed to scale

This is part two of our series on AI labour. Read part one here.

Muskan* sits in front of a mic in a makeshift studio carved out of her room in Govandi, an eastern suburb of Mumbai marked by dense settlements and waste-processing sites. It is midday: there is enough light in the room, and the children are away at school. A phone balanced on a stack of books, earphones plugged in, she records scripts on medical information and transcribes interviews, which then become training data for an AI-powered women’s health chatbot.

Just over 40, Muskan taught at a school before she quit and took up this freelance gig with a local NGO that is building a women’s sexual and reproductive health chatbot in partnership with the Gates Foundation. First, she learnt to voice short sentences–like “delivery didn’t happen”–and then she began playing with emotion to convey concern or urgency. The scripts grew longer, about 45 minutes each, on pregnancy, fertility, and family planning, which she would narrate with careful enunciation and, ideally, without background noise. She mumbled and fumbled at times, asked her children to practise with her, but kept going.

“Slowly the confidence came. Now it feels easy, really easy,” she says.

Without this emotional and linguistic labour–her voice, accent, pacing, and delivery–the final AI chatbot cannot reliably interact with its users, the women of her slum. What looks like an autonomous system is in fact sustained by her creative effort.

In part one of this series on the hidden architecture of intelligent machines, chatbots, and platforms, we showed how the intelligence of AI and machine learning rests on an invisible workforce of people who collect, clean, and create data. Still, these continue to be seen as ‘technical’, ‘entry-level IT’ tasks–routine in their make-up, requiring little skill, and thus meriting low pay. In this, the second part, we turn to the lives of workers whose tasks demand attention, context, judgment, and emotional endurance, but whose skilled labour is ultimately rendered replaceable and risky by opaque AI value chains. This labour is further devalued in India, where a history of tech outsourcing feeds the aspirations and economic mobility dreams of many, as we detail later.

The tech economy is reproducing colonial patterns of extraction and dependency, and the historic structures of caste and gendered hierarchies determine whose labour is seen with dignity, says Kai-Hsin Hung, a scholar of labour and value chains at the Interuniversity Research Centre on Globalisation and Work.

Scale of Harm

Rising Worker Vulnerability

Think of AI as a tiered global data economy, with action unfolding across geographies and channels. At the top are firms competing over chips and compute power; AI labs and product companies like OpenAI and Meta build and deploy models. At the bottom of this stack lies the human labour of workers like Muskan, who keep the data pipeline going. We wrote earlier about the racial logic governing this pyramid: capital and decision-making power are concentrated in the hands of select countries while data and labour are sourced from a dispersed periphery in the Global South.

Researchers told us that contracts given to third-party vendors and start-ups for this data work are squeezed on price, and that vendors compete aggressively on labour costs. This lowers wages and creates work arrangements without accountability. The digital factory has no clear boundaries and no obvious site of accountability: someone in Chittoor could be working at an outsourcing centre, employed by a Hyderabad start-up, to label data for a company in London.

In India, with declining formal employment opportunities, data work draws young people and educated women into feminised, low-paid jobs with its promise of flexibility and remote work. At the same time, the rise of generative AI since 2022 has scaled the demand for invisible labour, while also intensifying workers’ exposure to extreme, violent content.

The Union Government has partnered with global tech firms such as Microsoft and OpenAI under the IndiaAI Mission to skill hundreds of thousands of developers and establish AI labs across training institutes, without any recognition of the risks and rights in this work. “The gap between the scale of harm and institutional response is widening,” says Kai-Hsin Hung. It leads to poorer work conditions, health risks, and jobs that erode workers’ skills over time.

Colonial Data Factories

Before the pandemic, transnational firms carried out AI work remotely, but it has since expanded to include BPOs and KPOs that work with varied structures, as we reported earlier. One regime is contractual work with no job security or grievance mechanism. In another, workers are formally employed by BPOs and ITeS firms but are more heavily surveilled and face stricter quotas tied to speed and accuracy. Work is also increasingly governed by dashboards and faceless algorithms.

As we said earlier, workers like Muskan take these work opportunities for want of options and the promised flexibility. “I gave many interviews–to be a receptionist, real estate agent, or tours and travels–but there was the age limit, demand for fluent English or unpaid training,” she says. She took up teaching but the commute and care work wore her down. “I used to wake up at four in the morning, make food for everyone, pack my tiffin, then take the train, walk five-ten minutes to share an auto, then take another auto to the Vikhroli station, then walk 10 minutes to school,” she recalls.

Her current AI gig may go on for three more months, or six, if she’s lucky. “I didn’t know this world before, but the work gives me peace and I hope it continues,” she says. Working from home has changed the shape of Muskan’s day. She starts after her children leave for school, then sometimes works another three-four hours in the evening, weaving her paid data work around household chores. “This work feels much better…sitting at home and getting the same payment as other work.” Her husband is an auto driver, and with rising costs and their children’s education fees, the two still struggle to make ends meet.

When Gowri, 28, joined a vendor-based company near Chennai, she saw it as a path to a more ambitious software engineering role. Like her, many hope that it leads to a “real” tech job. The prestige associated with IT roles makes it easier for women in particular to negotiate with families who may otherwise frown upon women working–a reason why platforms like Amazon Mechanical Turk and Telus saw a surge of women joining during the pandemic.

Another reason is the rapid spread of outsourcing centres in tier-2 and tier-3 towns since the pandemic, part of a larger structural shift in which hiring has moved beyond the metros. Job openings in cities like Nagpur and Indore grew nearly 42%, far ahead of the 19% growth in tier-1 metros, according to a 2025 Randstad report.

The pitch to local workers is simple and persuasive–global IT work is coming home, and young people need not migrate for steady pay and a foothold in the tech economy. But women like Ashima*, in her early 20s, have found these promises illusory. She was planning to move from Chittoor to Chennai for a Master’s course when a data outsourcing company opened its first office in her town. There is work here now, why do you need to move, her family had argued. It is a decision she came to regret, as we detail later.

“In many cases, it came in the way of women seeking that independence and moving away from home,” says Srravya Chandhiramowuli, a researcher of data annotation work for AI at the University of Edinburgh.

(De)Valued

Broken Education-Employment Link

The Union Government is encouraging young Indians to embrace machine learning skills. Data annotation and model training roles are touted to absorb a large portion of India’s workforce impacted by automation, and by 2030, the market is expected to exceed $7 billion. The government think-tank NITI Aayog believes tasks that workers like Gowri and Muskan undertake require “low levels of expertise and present an opportunity to exploit labour cost arbitrage to serve companies globally”.

But this assessment is not necessarily correct, interviews show. Hyderabad resident Deepa* spent years annotating geospatial data–tracing roads, identifying signs, correcting maps–to the point that she can navigate parts of London from memory and at speed. AI trainers too have spoken about how crucial human oversight is.

Srravya’s research also showed that start-up companies hide the labour sustaining them. “You don't want the data annotation team to also grow as you grow as a company. You have to show that you're scaling efficiently, so you displace that work and move it out of sight,” one manager told her.

What’s damning is that women’s participation in this sector tracks closely with gender norms around cleaning and care, creating parallels to traditional caregiving, says Kai-Hsin. Across iMerit, Accenture, Telus, Teleperformance, and other work arrangements, women are often employed in low-level annotation teams, quality assurance, or content moderation. These in practice become “digital cleaning” jobs, with workers scrubbing datasets or tidying labels.

This also has consequences for the future of women’s labour participation in India. “For the last 50 years we've been pushing to educate women through initiatives like Beti Bachao, Beti Padhao…but the link between education and dignified employment is broken,” says Aditi Surie, a scholar at the Indian Institute of Human Settlements.

This work is not inherently numbing, we found; it has been made so–structured to require little engagement and carried out without transparency. This leaves no room for meaning or creativity in the labour.

‘No Future In This Role’

Tasks involved in data work are repetitive and decomposed, designed to strip away human judgment, says Kai-Hsin.

That was Gowri’s experience too. She was annotating real-time business insurance data for a company in Chennai, but left after 18 months because there was “no future in this role”. Most days she felt like “just a label marker”, doing “more work than was humanly possible”, and often without the training to handle complex data. She applied to 20 roles over eight months and did not bag any of them.

In Hyderabad, Beena* made a quieter calculation about exiting the system. After two years of working with a data annotation company, she asked for a work experience certificate for only one year. She worried that without any promotion or role change, her stint would look like stagnation. “I don't think I'm really helping myself by saying this is what I did for two years,” she told researcher Srravya. 

In India, studies of Adivasi women working with iMerit found they were channelled into the lowest-paid and most monotonous segments. Their knowledge disappeared into the model without recognition or authorship, researchers found.

Then take the way this work is evaluated. There is an overreliance on numbers and strict quotas; in data annotation work, for instance, ability is assessed by how quickly one can draw a bounding box and label an object (it should take under six seconds), or how many tasks one completes per hour.

One worker spent time trying to decipher the difference between the red cap of a Dr Pepper bottle versus that of a Coke. “It should not be anybody's job to know the difference between these kinds of details, and yet, that's what they're trying to learn,” Srravya adds. 

High productivity targets and sheer volumes of work also strip judgment and learning out of content moderation tasks. Content moderators in India working with companies like Accenture and L&T–often without proper training and mental health support–prioritise quick content work (like removing posts or live streams) over tasks that require complex thinking.

“Would never recommend de-ranking content as it would take time,” a 28-year-old told a researcher. This additional time would interfere with performance evaluation. This is textbook deskilling, Kai-Hsin says, where complex judgment work is broken into point-and-click operations governed by speed-based metrics, quotas, and dashboards. And without regulation or collective bargaining, these work opportunities lock many into a cul-de-sac.

Research by scholar Nithya Sambasivan and others showed that qualities such as care, diligence, and contextual reasoning–central to good data work–are systematically undervalued, even though they directly shape the quality and safety of AI outputs.

There are exceptions, such as in Muskan’s case, where there is an attempt to engage with ‘ethical AI’. While recording and transcribing notes, she learns about women’s sexual and reproductive health. She has also begun picking up new words and pronunciations, and has learned how to zoom in and steady the screen. She shares this with other women. “Talking to each other helps, you get solutions,” she says.

Data work, as Muskan’s case shows, requires skills that develop over time but are undervalued and misidentified–similar to paid care work like that of Accredited Social Health Activists (ASHAs) and domestic workers. Workers are the experts on the datasets they produce and train, so their expertise needs to be better valued and recognised–a process that researcher Srravya thinks can begin with expanding their work beyond the scope of a single task.

What if, in Humans In the Loop, the company recognised protagonist Nehma’s creativity, treated her as an equal, and encouraged her to upload and tag datasets that reflect the culture and contexts of the Adivasi community? Or what if content moderation for social media giants was organised the way it is for Wikipedia, with guidelines and transparency protocols in place that incentivise accuracy and judgment? “Working with people as equals can make this work far more integrated and meaningful,” says Srravya.

Caring For Machines 

The costs of making AI machines possible often shift to workers’ bodies and household budgets, we found, but neither the government nor employers recognise this. Muskan, for instance, complains of eye strain and back pain from long hours of sitting and staring at a phone screen. She paid for a doctor’s consultation out of her own pocket. She also pays for the invisible infrastructure of her work. “We don’t have Wi-Fi [because it adds to monthly expenses]; everything runs on mobile data,” she says. So the couple decided to wait and see if her gig lasts longer than six months before investing in one.

In outsourcing centres with strict targets, where days unfold doing 120 tasks an hour of numbing work with tightly policed bathroom breaks, workers speak of fatigue and exhaustion. “Workers may not articulate it as mental well-being concerns but these issues are distressing and have impacts on workers’ mental and physical health,” says Srravya.

Similar patterns have been documented in Kenya, the Philippines, and Colombia. Workers pay for internet connectivity, ergonomic chairs, therapy–especially content moderators who are exposed to violent content. A Kenyan court in 2023 ordered Meta to provide “proper medical, psychiatric, and psychological care” to content moderators who spoke of depression, anxiety, post-traumatic stress disorder, and suicidal ideation as outcomes of filtering pictures and videos of assault, self-harm, child abuse, necrophilia, rape, and more.

Under the recently enforced labour codes, “gig workers” and “platform workers” are defined only in the Code on Social Security, 2020. But they do not automatically fall within the scope of core protections in the Occupational Safety, Health and Working Conditions Code because they sit outside the traditional employer-employee framework. And because they are classified not as “employees” but as “independent contractors”, provisions ensuring minimum wages, maternity benefits, and occupational safety are absent unless the government implements specific schemes.

Asmita Bhutani, an assistant professor of labour at York University, Toronto, says despite this work being intensive to the body and mind, “no employer or government pays attention to women’s embodied labour or issues related to health and safety”. 

Kenya’s AI workers formed a Data Labelers Association last year to ask for mental health support, fair treatment, and upskilling opportunities. Similar demands echo among Indian workers too, but more than anything, they want to be treated as employees, says Asmita: “They are not surfing on their phones doing low-skill work; they are spending time, energy, and cognitive skills…they want to be treated not as contractors but as employees.”

Past Continuous

India’s Tech Mythification

Unlike in India, activism and collective action in Kenya and the Philippines since the pandemic have led to sustained scrutiny and legislative action. One reason is the visibility of the work itself: Samasource (a contractor for Meta) ran large physical content moderation centres in Nairobi where people met each other and shared work conditions, whereas in India most content moderation work is dispersed, routed through multiple actors, hybrid offices, and remote platforms, as we explained in an earlier report.

Secondly, tech work in India has been framed as meritocratic and aspirational, rooted in prestige and independence. This, as scholar Hemangini Gupta wrote in her book Experimental Times, contributes to a “nationalistic mythification that valourised tech products over the process and labour that goes into [them]”.

When India liberalised its economy, the IT sector and BPOs were seen as silver bullets for the country’s employment problems and, in turn, got regulatory leniency. They enjoyed exemptions from labour protections, such as the Industrial Employment (Standing Orders) Act–most recently in Karnataka, which extended its exemption up to 2029. These exemptions weaken standards on termination, working time, and grievance processes–what experts call “regulatory arbitrage”. Rising automation and AI layoffs have led to burnout, insecurity, and a mental health crisis in India’s IT sector.

Data work is being absorbed into the same architecture. Recent protests tend to focus on forced resignations or retrenchments, but sustained advocacy remains difficult. “IT workers are a classical example of people who believe their lies,” says Rikta Krishnaswamy, who has organised tech workers. There is another myth at work here: how can workers with phones and laptops, sheltered in offices or homes, be exploited like those doing manual work on the streets, in sweatshops and factories?

Identifying where large-scale data labour is located is itself a challenge, says Alagunambi, an organiser with UNITE, a union for IT and ITeS workers in Chennai whose members include those doing data tagging work. Firms like Teleperformance could be key sites, but the work is constantly being subcontracted, reclassified, and relocated.

India’s masculinised idea of tech also distances women. Asmita says: “That's one contradiction–women are a part of the tech economy, they’re instrumental in building up AI systems and contributing to profit for the tech ecosystem, but they're not really seeing themselves as part of that system, nor are the companies pitching them as such.”

Workers do find ways to outmanoeuvre the system: some finish hourly targets just ahead of the deadline, according to Srravya, and use the spare few minutes to take “micro breaks”. Others might develop a personal equation with their manager and seek leeway. These acts of resistance become less possible the further you are from the centre where these systems are built, she adds.

Regulatory Challenges

The recent legislative wins focus on location-based gig workers and make no mention of data workers or content moderators. Governments continue to see data workers as an extension of the ITeS and BPO industry, or as ‘future ready’ labour, experts say.

The Digital Personal Data Protection (DPDP) Act, 2023, governs only data governance and user rights–not the conditions of work, algorithmic management, or labour rights of the workers whose labour generates the datasets that fuel AI systems.

Under the Code on Social Security, part of the recently enforced labour codes, data workers remain outside the traditional employer-employee relationship, and core protections such as minimum wages, occupational safety, and maternity benefits are absent.

India’s national AI strategy and the IndiaAI Mission focus on innovation, compute capacity, dataset platforms, and startup financing. They frame ‘responsible AI’ in terms of limiting “technical harm caused by AI applications”, while leaving the workplace conditions of the people building or training those systems outside the frame.

The forthcoming AI Impact Summit has sessions on the “skilling and productivity” of the future labour force, but no explicit stream dedicated to worker rights or labour conditions for data workers, the official agendas released so far show. There are talks on compute capacity, dataset platforms, and startup financing, but conversations on labour protections remain in the shadows.

It is unclear if representatives of these groups were included in the consultations for these laws, says Nighat, a policy analyst with Aapti Institute. “These issues are passed on between ministries of labour, technology and trade, so it’s hard to pin accountability,” she adds.

Kai-Hsin sums up the paradox: despite frequent rhetoric about ‘responsible AI’ or ‘AI for jobs’, no regulatory energy is going into protecting upstream data workers. “There is no coherent regime that treats AI data work as hazardous, skilled labour deserving strong rights,” he says.

Further reading:

  • Aapti Institute has a voluntary code of conduct for companies employing data workers.
  • The modern tech economy and AI cannot work without millions of workers training algorithms and labelling data. Kenyan activist Joan Kinyua talks about transparency, basic rights, and fair pay for data workers, and how workers are already pushing back, in this AI Now Institute interview.
  • A documentary about the human workers behind AI in Nairobi, and the working conditions that push their physical, economic, and emotional well-being to the limits.

Saumya Kalia is a Delhi-based journalist who writes about gender, labour, and social equity. She has won the Laadli and REACH Media Awards for her gender journalism, and reported on gender and healthcare as a Dr. Amit Sengupta Health Rights Fellow. At BehanBox she is working on developing editorial series, building quieter spaces, and redefining news engagement across different platforms. She is deeply interested in thinking about grief, care, community, and cities.

Malini Nair (Editor)

Malini Nair is a consulting editor with Behanbox. She is a culture writer with a keen interest in gender.
