
Why artificial intelligence needs to be retrained to improve diversity

March 11, 2022
Photo: Robot with digital screen (AdobeStock)
Artificial intelligence (AI) systems must be systemically retrained to eliminate gender bias, industry analysts have advised, amid warnings that male-dominated development teams must be re-educated to build robotic automation systems that better reflect society.

Those robotic workers – which have rapidly become commonplace in everything from logistics centres to police forces and automotive manufacturing plants – have been programmed for a range of situations but will never fully replace humans, noted Sharan Burrow, general secretary of the International Trade Union Confederation (ITUC), which represents over 200 million workers in 163 countries.

“Many employers who started off with the romance of robotry now understand that it’s about augmented workforces,” she told a recent Sydney Dialogue panel discussion about the issues raised by ever-improving AI-based process automation.

“It’s about the technology that allows workers to do their job much easier, much safer, and more inclusively,” she said, citing the example of disabled workers who had not traditionally been able to work in manufacturing settings.

To accommodate the broad demographic mix in today’s workforce – and to enable businesses to take advantage of a more diverse set of skills – she said it was important that AI’s development engage with previously marginalised groups.

“The notion that you could have a fully robotic workforce is still something of science fiction, and should remain there,” she said.

“We ask that workers are at the design table, giving the company the benefit of their experience and their knowledge about process, but also being part of the solutions.”

A lack of diversity in engineering and robotics – where women comprise less than 10 per cent of workers – risks limiting the application of innovative techniques and missing out on potentially game-changing solutions.

“What are the solutions that we’re not even thinking about because we have such a non-diverse pool of people who are developing those technologies?” pointed out Dr Sue Keay, chair of the Robotics Australia Group.

“There are a number of ways that we can try and address biases in the way these technologies are created, including deliberately programming the technologies to be more representative, and testing them to ensure they aren’t unfairly discriminating against people.”

“For these technologies to develop in a healthy and sustainable way,” she added, “we really need to see a lot more pressure for corporate social responsibility standards to apply – and to make sure companies are held to account for the type of technologies they’re creating to make sure that bias isn’t creeping into algorithms.”
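Keay’s suggestion of testing systems to ensure they aren’t unfairly discriminating can be made concrete with a simple audit. The sketch below, in Python, uses entirely hypothetical outputs from an imagined hiring-screen model and computes each group’s selection rate plus the gap between them (a demographic parity check); it illustrates one common style of bias test rather than any particular organisation’s methodology.

```python
# A minimal sketch of the kind of bias test described above: checking whether
# a model's positive outcomes are spread evenly across demographic groups.
# The predictions, group labels, and threshold are hypothetical illustrations.

from collections import defaultdict


def selection_rates(predictions, groups):
    """Return the share of positive predictions for each demographic group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}


def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values()), rates


if __name__ == "__main__":
    # Hypothetical hiring-screen outputs: 1 = shortlisted, 0 = rejected.
    preds = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
    groups = ["men", "men", "men", "men", "men", "men",
              "women", "women", "women", "women", "women", "women"]

    gap, rates = demographic_parity_gap(preds, groups)
    print("Selection rates:", rates)
    print(f"Demographic parity gap: {gap:.2f}")
    # A gap well above ~0.1 (an arbitrary illustrative threshold) would flag
    # the system for review before deployment.
```

In practice such a check would run over real model outputs and a richer set of protected attributes, with the acceptable gap set by policy rather than hard-coded.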

Today’s AI is hindering diversity

Although AI is revolutionising the workplace, most organisations are still far from the inclusive future that Burrow and others envision.

Citing a recent future-of-work study of 774 Indian companies – which found that 71 per cent had fewer than 10 per cent women and 30 per cent had no women at all – Dr Samir Saran, president of Indian thinktank the Observer Research Foundation, warned that developers of the AI workforce need to take a different approach.

“We have to distinguish between Star Wars and ‘narrow AI’” that is applied to specific functions, as in a manufacturing setting, Saran said, noting that “collaboration of machine with a modestly skilled person in the emerging world allows them to deliver more, and gives them a greater return on their time than they were ever able to get previously.”

With so few women represented in the design of AI systems, he said, “right now we have to undo a lot of historic wrongs. We need an unlevel playing field, and we have to program for that unlevel playing field by adding synthetic data to rectify some of the past.”
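Saran’s idea of adding synthetic data to rectify the past amounts, in practice, to rebalancing the training set. The sketch below, again hypothetical, oversamples an under-represented group until the data reaches parity; real pipelines would typically generate or collect genuinely new examples rather than duplicating existing ones, but the rebalancing principle is the same.

```python
# A minimal sketch of dataset rebalancing: augmenting training data so an
# under-represented group is no longer drowned out. The records, group field,
# and oversampling strategy are hypothetical illustrations.

import random
from collections import Counter


def oversample_to_parity(records, group_key="group", seed=0):
    """Resample minority-group records until every group is equal in size."""
    rng = random.Random(seed)
    by_group = {}
    for rec in records:
        by_group.setdefault(rec[group_key], []).append(rec)

    target = max(len(recs) for recs in by_group.values())
    balanced = []
    for group, recs in by_group.items():
        balanced.extend(recs)
        shortfall = target - len(recs)
        if shortfall > 0:
            # Duplicate-sample existing records; a real system would instead
            # add genuinely new or synthetically generated examples here.
            balanced.extend(rng.choices(recs, k=shortfall))
    return balanced


if __name__ == "__main__":
    # Hypothetical training records skewed 4:1 towards one group.
    data = [{"group": "men", "feature": i} for i in range(8)] + \
           [{"group": "women", "feature": i} for i in range(2)]

    balanced = oversample_to_parity(data)
    print(Counter(rec["group"] for rec in balanced))  # Counter({'men': 8, 'women': 8})
```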

Developers need to stop focusing so much on the technology and think more about its social context, he added.

“The hunt for the elusive unicorn using tech models and workspaces is damaging larger constructs,” Saran said, warning that technology education needs to step back from its deterministic focus on AI and take a broader perspective.

“We need to invest huge amounts into social sciences and thinking about the social dimensions of technology now,” he said, “because we are losing the race.”

“Technologists need to be humanised, and they need to understand that they are really coding the future – not coding a bottom line. Automation is going to lead us to the need for a new dialogue.”