
Auckland startup Soul Machines creates avatar Nadia to help users navigate the NDIS

March 13, 2017

The rollout of the National Disability Insurance Scheme has been welcomed by those in need around the country. However, as with many a government service, the process of engaging with it – signing up, finding information, using it – can be difficult, particularly for those with a disability.

Looking to make things a little easier is Nadia, a virtual assistant that helps users find information about, and interact with, the National Disability Insurance Scheme online.

Nadia is the first commercial project for Auckland company Soul Machines, founded by Dr Mark Sagar out of the University of Auckland. Previously the Special Projects Supervisor at Weta Digital, Sagar worked on films such as Avatar and King Kong, winning Oscars for his work in 2010 and 2011.

According to Sagar, the vision of Soul Machines, which raised US$7.5 million in funding last year, is to redefine and humanise the way people interact with computers.

“[It provides] a 21st century user interface in the era of Artificial Intelligence, robots and AR/VR using what has always been the most important human-to-human user interface – the human face – as a platform. The application of the technology is vast but the essence is that it augments, rather than eliminates the need for, face-to-face services,” he said.

Sagar explained that one of his early motivations behind the development of Soul Machines was to create a character that could animate itself, which means, in a sense, giving it a sort of digital life force, or putting a ‘soul’ into the machine.

“We took computational neuroscience models of how the brain is thought to work and linked them to biomechanically based models of human faces. Our long-term goal is to better understand human nature through exploring biologically based models of embodied cognition and our fundamental driving forces,” he said.

“Part of this long term goal is to better understand the brain and consciousness. I think the mystery of consciousness is one of the most interesting things in the world; it is after all how we subjectively experience everything. I wanted to choose a mission that would keep me passionately interested for the next 20 years.”

The development of Nadia began a few years ago, when Marie Johnson, head of the NDIS Technology Authority, contacted Sagar after he gave a demonstration of the company’s avatar technology at a conference in Adelaide.

She invited him to speak and network at the NDIS New World conference, where they then began discussing how the Soul Machines technology could be used to improve accessibility and simplify the way people interact with government services.

From there development began. With Soul Machines creating the avatar technology that gives Nadia her face and expressions, a co-design process saw the company work with NDIS technology architect Piers Smith, the Disability Innovation Research Group (DIRG), and the IBM Watson team to bring all the various elements required together.

The IBM Watson team worked with all the partners to teach Watson the fundamental language and concepts behind the range of disabilities, the types of services out there, and the mechanisms available to find support. Meanwhile, the NDIS sought feedback from thousands of people with disability and worked with psychologists and scientists from the University of the Sunshine Coast to develop dialogue and to evaluate responses to the avatar and its effectiveness in communicating and conveying emotion.

For Soul Machines, Nadia evolved out of Baby X, developed by Sagar and his engineering research team at the Laboratory for Animate Technologies at the University of Auckland’s Bioengineering Institute.

Described by Sagar as “a working intelligent, emotionally responsive baby”, Baby X can see and hear you when you stand in front of her, using camera and microphone inputs for facial tracking and voice analysis.

“Her biologically inspired ‘brain’ is capable of displaying her emotional response. For example, you can show her a picture of a sheep and she can recognize the image, and audibly respond with ‘sheep.’ Baby X accurately provides participants with a sense of her being there as they talk and interact,” Sagar explained.

“This cognitive platform is being applied to Nadia’s embodiment; the human-looking face and voice. Nadia is built on some of the most detailed interactive physiological models ever created, so that bone structure and muscle moves exactly as it does in real life.”

The bone structure and muscles are linked to and controlled by neural networks built into an Embodied Cognition Platform (ECP), which is essentially a central nervous system for the avatar. The ECP controls the emotions the avatar can show, delivers speech, and reacts to the voice it hears and the face it sees through a computer microphone and camera.
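
To make that loop concrete, the sketch below shows one way a perception-to-expression cycle of this kind could be organised in code. It is a minimal illustration only: the class and field names are assumptions made for the example, not anything taken from Soul Machines’ actual platform.

```python
# Hypothetical sketch of a perception-action loop in the spirit of the
# Embodied Cognition Platform described above. All names are illustrative
# assumptions; none come from Soul Machines.
from dataclasses import dataclass

@dataclass
class Percept:
    """What the avatar 'senses' on one tick: a camera frame and an audio chunk."""
    face_detected: bool
    user_speech: str          # transcribed text, empty if the user is silent
    user_valence: float       # rough estimate of the user's mood, -1.0 to 1.0

@dataclass
class Response:
    """What the avatar expresses back on one tick."""
    expression: str           # cue for the facial-muscle model
    speech: str               # text handed to the voice synthesiser

class EmbodiedAgent:
    """Toy 'central nervous system': maps sensed input to expression and speech."""

    def __init__(self):
        self.mood = 0.0       # internal emotional state, updated every tick

    def step(self, percept: Percept) -> Response:
        # Internal emotion drifts toward what the agent perceives in the user.
        if percept.face_detected:
            self.mood = 0.8 * self.mood + 0.2 * percept.user_valence

        # The expression is chosen from the internal state, not directly from
        # the input, so the face keeps "living" even when the user says nothing.
        expression = "smile" if self.mood >= 0.0 else "concern"

        # In this toy version the agent only speaks when spoken to.
        speech = f"You said: {percept.user_speech}" if percept.user_speech else ""
        return Response(expression=expression, speech=speech)

if __name__ == "__main__":
    agent = EmbodiedAgent()
    print(agent.step(Percept(face_detected=True, user_speech="Hello Nadia", user_valence=0.5)))
```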

Nadia’s character and personality were further brought to life by actress Cate Blanchett, who volunteered to be her voice; having a close relative with a disability, Blanchett said the project hit close to home for her.

To be trialled on the NDIS’s participant portal, Nadia will be available 24/7 to guide users through their online interactions and transactions with the service via speech, with the aim of giving people with disability a personal, tailored experience.

“The Nadia interface is designed to be uncluttered and intuitive, and contains only three key components: the avatar window, where people can see Nadia’s face; the chat window, where people can see their conversation history with Nadia; and the input window, where people can either type questions or open their microphone to talk to Nadia,” Sagar explained.

After someone asks Nadia a question, either by typing, speaking, or through another input, Nadia performs three tasks.

The first is understanding, with Sagar explaining, “Nadia uses her natural language capability to interpret the question and form a hypothesis through cognitive reasoning as to what the person really meant.”

The second task is answering, with Nadia’s dialogue capability then serving the user the corresponding response. These have been written by a specialist copywriting team, with selected batches of answers also having been reviewed by an expert in intellectual disability from the psychology faculty at the University of the Sunshine Coast and tested with groups of people from the disability community.

The third task is communicating, with the avatar engine turning the text answer into speech by using the voice library recorded by Blanchett. The audio goes hand in hand with a corresponding facial animation that matches the speech and the emotion that the Agency has seeded for each answer, Sagar explained.
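
Chained together, those three tasks amount to a simple pipeline: interpret the question, retrieve the curated answer, then package it for voice and facial animation. The sketch below illustrates that flow in heavily simplified form; the function names and canned answers are assumptions made for this example, not the NDIS or IBM Watson implementation.

```python
# Hypothetical three-stage pipeline mirroring the understand / answer /
# communicate steps described above. Names and answers are illustrative only.

def understand(question: str) -> str:
    """Stage 1: map free-form text to an intent label (stands in for NLU)."""
    q = question.lower()
    if "plan" in q:
        return "ask_about_plan"
    if "eligib" in q:
        return "ask_about_eligibility"
    return "general_question"

ANSWERS = {
    # Stage 2: curated responses, standing in for the copywritten answer bank.
    "ask_about_plan": "Your plan lists the supports the NDIS has funded for you.",
    "ask_about_eligibility": "Eligibility depends on your age, residency and disability.",
    "general_question": "I can help with questions about how the NDIS works.",
}

def communicate(answer: str, emotion: str = "warm") -> dict:
    """Stage 3: package text for speech synthesis plus a facial-animation cue."""
    return {"speech_text": answer, "emotion_cue": emotion}

def handle(question: str) -> dict:
    intent = understand(question)              # 1. understanding
    answer = ANSWERS[intent]                   # 2. answering
    return communicate(answer)                 # 3. communicating

if __name__ == "__main__":
    print(handle("Am I eligible for the NDIS?"))
```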

The first iteration of Nadia will be able to answer questions about the NDIS and how it works, with the next generation to build on this by enabling Nadia to answer questions about an individual’s particular situation, including specifics about the user’s plan and plan implementation.

“The following generation of Nadia, also due for release in 2017, will build further on this personalisation capability and will be able to assist people with complex transactions using conversational models of interaction.”

Image: Dr Mark Sagar and Cate Blanchett. Source: Soul Machines.