Mothering AI, The Adivasi Way: Can Indigenous Knowledge Decolonize Artificial Intelligence?


Artificial intelligence is often presented as a neutral, objective, and universal technology, a disembodied brain crunching data to arrive at “truth.” But what if this supposed neutrality is a dangerous illusion? What if AI, like so many technologies before it, is deeply shaped by the biases and worldview of its creators—predominantly Western, often male, and largely disconnected from other ways of knowing? In a sharp critique inspired by the film Aramya, Abhik Bhattacharya argues that the “Western gaze” dominates AI, silencing Indigenous perspectives. He proposes a radical alternative: could an “Adivasi way” of knowing offer a path to decolonize AI and create a more just and holistic technological future?


The Information Box

Syllabus Connection:

  • Paper 1: Chapter 1.7 (Anthropology of Knowledge/Epistemology), Chapter 12 (Postmodernism/Postcolonialism), Chapter 1.7 (Digital Anthropology)
  • Paper 2: Chapter 7 (Role of Anthropology in Tribal Development), Chapter 1.1 (Indian Civilization: Tribal Contributions)
  • GS-3/Essay: Artificial Intelligence, Ethics, Indigenous Knowledge Systems

Key Concepts/Tags:

  • Decolonizing AI, Indigenous Knowledge Systems (IKS), Postcolonial Theory, Epistemology, Western Gaze, Aramya (Film), Adivasi

The Setting: Who, What, Where?

This case study analyzes the ideas presented in Abhik Bhattacharya’s critique, which uses the fictional film Aramya (directed by Sahay) as its central example. The film depicts Nehna, a divorced Adivasi woman in Jharkhand, working in a data-labeling center for a US-based AI company, where her job is to teach an AI system to identify pests. The core conflict arises when Nehna refuses to label a particular caterpillar as a “pest,” arguing from her Indigenous perspective that “This is not a pest. It only eats the rotten part of leaves.” This refusal becomes the catalyst for exploring the clash between Western, reductionist AI logic and holistic, Indigenous epistemologies.

The Core Argument: Why This Study Matters

This critique uses a fictional scenario to diagnose a profound real-world problem in the development of AI.

  1. AI Is Not Neutral; It Reflects the “Western Gaze”: The central argument is that feeding data into AI is an inherently ideological act: the systems are trained predominantly on Western categories, assumptions, and values. Bhattacharya argues this continues a colonial pattern in which Western ideas (such as secularism and liberalism) were presented as “neutral” and universal while other ways of knowing were marginalized. AI, in its current form, risks becoming a powerful new tool for reinforcing this Western epistemic hegemony.
  2. The Silencing of Indigenous Perspectives: Nehna’s specific act of resistance in the film—refusing to label the caterpillar as a “pest”—is used as a metaphor for the larger silencing of Indigenous Knowledge Systems (IKS). Her holistic understanding, rooted in observation and a deep relationship with the forest ecosystem (knowing the caterpillar eats only rotten leaves), is incompatible with the AI’s simple, binary, decontextualized logic (“pest” or “not pest”). This silencing is not accidental; it is a structural bias in how AI models are currently trained (a small illustrative sketch of this contrast follows this list).
  3. An “Adivasi Way” as an Alternative Epistemology: The critique proposes that Indigenous ways of knowing offer a necessary alternative. It highlights Nehna’s relationship with nature (symbolized by a porcupine guide) as representing a non-dualistic, relational epistemology. Unlike the Western model that separates human/nature, subject/object, the Adivasi perspective sees an interconnected whole. This holistic view, Bhattacharya suggests, could lead to AI systems that are less reductionist and more attuned to complex ecological and social realities. It’s a call to “mother” AI differently.
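
To make the reductionism described in point 2 concrete, here is a minimal, purely illustrative sketch in Python. All names in it (BinaryLabel, RelationalLabel, feeds_on, annotator_note, and so on) are hypothetical and are not drawn from the film or from any real data-labeling platform; the sketch simply contrasts a schema that can only record “pest”/“not pest” with one that leaves room for the contextual knowledge Nehna voices.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BinaryLabel:
    """The reductionist schema: every organism must be 'pest' or 'not pest'."""
    image_id: str
    is_pest: bool  # no room for "it depends on what it eats"

@dataclass
class RelationalLabel:
    """A schema that keeps ecological context alongside the category."""
    image_id: str
    organism: str
    feeds_on: str                              # e.g. "rotten part of leaves"
    ecological_role: str                       # e.g. "decomposer"
    harms_living_crop: Optional[bool] = None   # may be unknown; not forced into a binary
    annotator_note: str = ""                   # space for situated, local knowledge

# The binary schema forces a lossy, decontextualized choice...
forced = BinaryLabel(image_id="caterpillar_017", is_pest=True)

# ...while the relational schema can record what Nehna actually knows.
situated = RelationalLabel(
    image_id="caterpillar_017",
    organism="caterpillar",
    feeds_on="rotten part of leaves",
    ecological_role="decomposer",
    harms_living_crop=False,
    annotator_note="Not a pest: it only eats the rotten part of leaves.",
)

print(forced)
print(situated)
```

The point of the contrast is not that a richer schema by itself answers Bhattacharya’s critique, only that the loss of context he describes is built into the very data structure the annotator is given.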

The Anthropologist’s Gaze: A Critical Perspective

  • Postcolonial Critique of Technology: This analysis is a classic example of applying postcolonial theory to technology. It argues that AI, far from being a post-historical or neutral force, is deeply embedded in historical power relations and risks perpetuating colonial patterns of knowledge domination. It forces us to ask: whose worldview is being encoded into the algorithms that will shape our future?
  • The Anthropology of Knowledge (Epistemology): The case study is fundamentally about epistemology—the study of how we know what we know. It contrasts two different epistemes: the Western, analytical, mathematical model favored by current AI, and an Indigenous, holistic, relational model embodied by Nehna. The critique champions the validity and necessity of the latter, arguing for a more pluralistic approach to knowledge in technology design.
  • Can AI Truly Be “Mothered”?: A critical question arising from the essay’s central metaphor is whether AI can be “mothered” in an Adivasi way. Can a technology fundamentally based on binary logic, vast datasets, and mathematical categorization ever truly incorporate a holistic, relational, and often non-quantifiable Indigenous worldview? Or will the attempt inevitably lead to a superficial appropriation or distortion of that worldview?

The Exam Angle: How to Use This in Your Mains Answer

  • Types of Questions Where It Can Be Used:
    • “Critically analyze the impact of globalization and modern technology on indigenous communities and their knowledge systems.”
    • “Discuss the relevance of postcolonial theory in contemporary anthropological analysis.”
    • GS-3/Essay: “What are the ethical challenges posed by Artificial Intelligence? Discuss the need for diverse perspectives in AI development.”
  • Model Integration:
    • On Indigenous Knowledge: “The relationship between modern technology and Indigenous Knowledge Systems (IKS) is complex and often fraught with bias. As critiques like Abhik Bhattacharya’s analysis of AI development argue, the dominant ‘Western gaze’ in data labeling often silences or invalidates holistic, relational Indigenous epistemologies, highlighting the need to decolonize technology.”
    • On Postcolonial Theory: “Postcolonial theory remains highly relevant for analyzing contemporary power dynamics, even in technology. The argument that AI, in its current form, risks reinforcing Western epistemic hegemony by marginalizing non-Western ways of knowing (as explored through the fictional example of Nehna in ‘Aramya’) demonstrates how colonial patterns can persist in new domains.”
    • For GS-3/Essay on AI Ethics: “A major ethical challenge in AI is algorithmic bias, which often stems from a lack of diverse perspectives in its development. Critiques argue for incorporating non-Western and Indigenous epistemologies—an ‘Adivasi way’ of knowing—to create AI systems that are less reductionist and more attuned to complex social and ecological realities, moving beyond a purely ‘Western gaze’.”

Observer’s Take

Abhik Bhattacharya’s critique is a vital and urgent intervention. It takes the seemingly abstract world of AI algorithms and grounds it in the lived realities and profound wisdom of Indigenous communities. The story of Nehna and the caterpillar is a powerful parable for our times, warning us that in our rush to build an artificial intelligence based on Western logic, we risk creating a machine that is not just biased, but fundamentally blind to the complex, interconnected nature of the world. The call to “mother AI the Adivasi way” is a radical plea for epistemic justice—a demand that the technologies shaping our future must learn to listen to, and value, the diverse ways of knowing that have sustained human societies for millennia.


Source

  • Title: Mothering AI, The Adivasi Way
  • Author: Abhik Bhattacharya
  • Publication: The Indian Express
