Event
Responsible AI
January 23, 2023

‘Decolonising AI? A View from the South’: MozFest 2022

What does it mean to decolonise AI? What capacities and institutions are needed to meaningfully shift power in our AI futures?

Recent years have seen scholars drawing attention to the power asymmetries at play in the production and dissemination of AI. Some contend that this is a continuation of earlier colonial logics. Much of this discussion has been led by thinkers from industrialized economies. Ironically, voices from developing countries are notably absent.

In an attempt to bring such voices to the fore, Digital Futures Lab and Research ICT Africa held a session on ‘Decolonising AI’ as part of MozFest 2022. The workshop brought together scholars from developing countries and various civil society organizations to speak about what decolonisation meant for them; whether and how it was possible; and the capacities and institutions that would be needed to meaningfully shift power in our AI futures.

Particular attention was drawn to how narratives of decolonisation and digital sovereignty are appropriated by state actors and domestic industry bodies in the Global South to further their own interests. The following are some of the key points from the session.

  • Anja Kovacs of the Internet Democracy Project pointed out that while geopolitics makes it important to think of data sovereignty in terms of states and corporations, it is also important to consider how data collection is intertwined with the bodies of the users it is collected from. 
  • Chidi Oguamanam, a Professor of Law at the University of Ottawa, urged us to treat colonialism as a metaphor. In the context of AI, decolonisation would mean ridding the conception, design, operation, and application of AI of all forms of bias that perpetuate asymmetries and hierarchies. 
  • Arthur Gwaga, a doctoral researcher at Utrecht University, pointed out that the Global South needs to be involved not only at the technical and policy levels, but also in decision-making within design processes. 
  • Kiito Shiliongo from Research ICT Africa opined that the production of AI mirrors the Western view of the human as a rational agent. She introduced the concept of Ubuntu, an idea that allows for communitarian notions of control over data. 
  • Maya Ganesh, Senior Research Fellow at the Leverhulme Centre for the Future of Intelligence, stated that AI is a set of knowledge-making technologies: knowledge of the past, present, and future of the human and non-human worlds. She envisions decolonising AI as a means to dignity, justice, intellectual freedom, and a refashioning of creative scientific curiosity and social progress. 
  • Divij Joshi, Doctoral Researcher at University College London, pointed out that decolonising AI involves understanding the epistemic basis of current practices and challenges in data science and artificial intelligence, in the context of the knowledge infrastructures we are trying to build. Paramount among these is recognising the historical contingencies in which these technologies are rooted. 
  • Padmini Ray Murray opined that databases in India are built on the assertions, assumptions, and questions of dominant-caste elites. She cited the example of AgriStack - a technological infrastructure for agricultural practices offered by the Government of India in collaboration with private partners like Microsoft - which excludes female farmers since they are not land-owners. In this way, she showed how colonial logics are re-appropriated by local elites. 
  • Amrita Nanda, Researcher at the Aapti Institute, opined that in the governance of AI, data policy needs to actively work against the discriminatory footprints established by colonial legacies. Furthermore, efforts to empower communities must ensure that those communities are not burdened with both building the infrastructure and maintaining it. 

Their differences aside, the panelists seemed to agree that (a) there exists a de facto power asymmetry between subjects that have data-infrastructural capacities and those that do not, and (b) the logic of this asymmetry may fruitfully be framed as a continuation of a ‘colonial legacy.’
