AI/Machine Learning

How to think about AI when it comes to including Indigenous Australians

October 8, 2024
By Lisa Sarago
The potential of AI is arguably one of the most exciting and terrifying topics in the tech and business sectors today.

On one hand, there is a real opportunity for AI to address business and societal challenges, while boosting the economy and our overall productivity.

On the other hand, the cinematic portrayals of AI developing a mind of its own and doing more harm than good could be closer to reality than many would like to think.

What’s clear is that we – the people building, designing, and using AI – need to be proactive in addressing potential biases in the technology.

There are stories from around the world of racial bias in AI tools leading to innocent people being accused of crimes they didn’t commit, or of image generators asked to portray a certain profession or industry instead producing stereotypes rooted in significant gender bias.

In fact, recent research by Charles Sturt University used generative AI to create professional images in health and medicine. Despite 54.3% of undergraduate medical students in Australia being women, only 39.9% of the artificially generated images depicted women. Meanwhile, only 7.7% depicted medium skin tones, and none depicted dark skin tones.

Indigenous voices needed at the table

If there are no Indigenous voices among the teams designing or building AI, there is little chance that Indigenous perspectives, lived experiences, ideas, or innovations will be part of the results of those products. We need diverse voices and thinkers at every stage of AI development, policymaking, implementation and adoption.

Not only will this lead to more inclusive technologies and products, but it will also give us our best chance of bridging the digital divide. Currently, remote Indigenous communities are among the most digitally excluded people in Australia: 43% of Indigenous communities and homelands have no mobile service, and 45.9% of remote Indigenous people are “highly excluded”, compared with 9.4% of the Australian population overall.

Imagine if we had Aboriginal and Torres Strait Islander people sharing their own experiences with digital technologies, what works and doesn’t work for them, and then using that lived experience to design AI tools that could bridge the digital gap for their own communities and those across the country.

What follow-on benefits and learnings could then be applied elsewhere: to older generations also experiencing digital exclusion, to preparing Country for fire season and improving how we respond to bushfires, to farming communities needing technology to improve the outcomes of their crops, and to people with disability whose specific needs the current systems don’t accommodate?

Who’s thinking about AI’s impact on Indigenous communities?

Very few people are proactively and specifically thinking about how AI could affect Indigenous communities. We know that when the impacts of AI on a specific community or group of people are not considered proactively, that group is most likely to be left out of AI’s benefits and, in many cases, to experience more harm than good.

This combination of a lack of proactivity and a lack of consideration for Indigenous perspectives is why many Indigenous people are sceptical of AI.

Ironically, by overlooking the needs and inputs of Indigenous communities, those designing AI tools are missing out on a whole lot more. First Nations people were the original inventors. We have tens of thousands of years of experience in innovation. Imagine the range of ideas that could be brought to the table and brought to life if Aboriginal and Torres Strait Islander people were involved in every step of the process.

With AI Month just around the corner, many businesses and entrepreneurs will be celebrating their achievements to date, and the potential of their investments in AI.

It’s important to create space for these discussions, but more importantly, I invite every person currently involved in an AI project of any kind to use this opportunity to take a step back and reflect on where diversity of thought has been welcomed – if at all – in your project.

Have Aboriginal and Torres Strait Islander people been included in the design? Are culturally diverse people part of the testing process? Do users of the tool have homogeneous cultural backgrounds?

There are simple ways to drive diverse outcomes from AI, and they start with taking a proactive approach to applying a cultural lens.

  • Lisa Sarago is CEO and cofounder of Land on Heart and Land on Heart Foundation, and founder of Tiddas in Tech