Smart glasses could help people who are blind or vision impaired navigate by sound

Augmented reality glasses combine machine vision and AI to add digital information to what we see. Entrepreneur Robert Yearsley is adapting this technology to help people who are blind or vision impaired navigate by spatial sound.


Navigating by spatial sound

Robert Yearsley is adapting augmented reality glasses to help people who are blind or vision impaired navigate by spatial sound. He is doing this through his company ARIA Research.

‘ARIA stands for Augmented Reality In Audio,’ Robert tells us. ‘We are building a lightweight computer that fits into a pair of glasses. The computer uses machine vision and AI to understand where you are in space and how to manoeuvre around that space.’

The computer then translates this information into audio. For example, when ARIA detects a tissue box, it emits a spatial sound from the box's location to help the user find it.

‘If I hear the sound of a tissue box and I reach out for it and touch it, the illusion becomes real,’ Robert says. ‘This synthetic augmentation through sound gives the user a vision analogue.’
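Placing a sound where an object sits is the core of this "vision analogue". As a minimal sketch of the idea (not ARIA's actual audio pipeline), an object's horizontal angle relative to the wearer can be mapped to left/right speaker gains with a constant-power stereo pan; the function name and angle convention here are hypothetical:

```python
import math

def pan_for_object(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan for an object at a given azimuth.

    azimuth_deg: -90 (fully left) .. +90 (fully right), 0 is straight ahead.
    Returns (left_gain, right_gain), each in 0..1. This is a toy
    simplification: real spatial audio would also use head-related
    transfer functions, elevation and distance cues.
    """
    # Map [-90, +90] degrees onto [0, pi/2] radians.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    left = math.cos(theta)   # full gain at -90, zero at +90
    right = math.sin(theta)  # zero at -90, full gain at +90
    return left, right
```

An object straight ahead (azimuth 0) yields equal gains on both channels, so the cue appears centred; as the wearer turns, the gains shift and the sound appears to stay pinned to the object.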

Codesigning ARIA with consumers

To help codesign the glasses, Robert employs people who are blind or vision impaired as subject matter experts. They work in the engineering, product design and management teams, and sit on the advisory board.

‘It's hard to get a complete vision of what you want,’ Robert says. ‘But putting the end users in front of the engineers and scientists is the absolute key. They explicitly say what we need to do to solve problems.’

One example of codesign was deciding where to place the machine vision cameras on the glasses. ‘The subject matter experts told us we need to position the cameras so they can “see” edges they might fall off,’ Robert says. ‘They also need to “see” hazards at head height that their cane won’t detect, such as tree branches.’

The next prototype

Robert hopes the next prototype of the glasses will be a usable product. ‘Our first prototype was a helmet with a 12 kg box of electronics. That is a science experiment, not a product!’ he laughs.

‘Now we are working on the form factor to fit all the technologies in the arms of the glasses and make them robust. That means fundamental research to come up with new methods for miniaturising the systems, which is super exciting.’

A soundscape of objects users can understand

Another design challenge is translating objects into a soundscape users can understand. ‘The first challenge is to teach ARIA to recognise objects,’ Robert explains. ‘To do this, we are collecting video information about what everyday life looks like. We will use that to rank the importance and distribution of 256 objects that users interact with.’

When the machine vision system detects these objects, AI will translate that vision into a soundscape like music. ‘There are notes and tones and cadences that fit well together,’ Robert explains. ‘Users get the information as an enjoyable flow. They can choose to focus on one aspect of the flow or all of it.’
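One way to picture "notes that fit well together" is a fixed mapping from object classes to consonant pitches, with nearer objects sounding louder. The object labels, frequencies and loudness curve below are invented for illustration, not ARIA's design:

```python
# Hypothetical object-to-note mapping: C4, E4 and G4 form a major triad,
# so simultaneous cues blend musically rather than clashing.
OBJECT_NOTES = {
    "door": 261.63,        # C4
    "chair": 329.63,       # E4
    "tissue_box": 392.00,  # G4
}

def soundscape(detections: list[tuple[str, float]]) -> list[tuple[float, float]]:
    """Turn (label, distance_in_metres) detections into (frequency_hz, loudness) cues.

    Loudness falls off with distance, so closer objects stand out.
    Unrecognised labels are simply silent.
    """
    cues = []
    for label, distance in detections:
        freq = OBJECT_NOTES.get(label)
        if freq is None:
            continue
        loudness = max(0.0, min(1.0, 1.0 / (1.0 + distance)))
        cues.append((freq, loudness))
    return cues
```

For example, a door one metre away would map to a C4 tone at half loudness, while the same door five metres away would be much quieter.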

AI will also help users understand the sound information without being overwhelmed by it. ‘One of the roles of AI is to figure out what's important for the user and switch other stuff off,’ Robert notes.
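The idea of switching less important cues off can be sketched as a simple ranking filter. The importance weights and cap below are made-up placeholders; in practice these would be learned and context-dependent:

```python
# Hypothetical importance weights for detected objects. In a real system
# these would come from the AI model, not a static table.
IMPORTANCE = {
    "person": 1.0,
    "stairs": 0.9,
    "door": 0.6,
    "tissue_box": 0.3,
}

def filter_cues(detections: list[str], max_cues: int = 3) -> list[str]:
    """Keep only the most important detections, silencing the rest
    so the user is not overwhelmed by sound."""
    ranked = sorted(detections, key=lambda d: IMPORTANCE.get(d, 0.0), reverse=True)
    return ranked[:max_cues]
```

With a cap of two cues, a scene containing a person, stairs, a door and a tissue box would sound only the person and the stairs.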

Pilot clinical trial

Robert is conducting a pilot clinical trial to test the safety and usability of the glasses. The team is also testing whether the glasses meet the everyday challenges of being blind or vision impaired. ‘We are setting up scenarios like a mock doctor’s office. That lets us test if the device helps users navigate to reception and find a seat,’ Robert says.

Lifelike scenarios help the team answer questions such as: ‘Does the device help people move through space, find things and interact socially? Can they identify if there are people nearby and approach them?’    

Changing lives at scale

‘In ARIA, I found the opportunity to create a piece of technology that can change lives at scale,’ Robert enthuses. ‘Once I saw it in my head, it took about 5 seconds to decide that that's what I'm going to dedicate my life to.

‘MRFF funding has allowed us to gestate and grow the technology and the team in Australia. We are working with some of Australia's leading research scientists and engineers. That means we can leverage local talent and keep the company in Australia.’

The MRFF funded ARIA Research with $2.3 million through the MTPConnect programs BioMedTech Horizons and Clinical Translation and Commercialisation Medtech.
