How has artificial intelligence use in hearing technology evolved?

Advances in technology are creating better experiences for users and for healthcare professionals. BIHIMA spoke with Dr. Dave Fabry, Chief Innovation Officer at Starkey, to discuss how artificial intelligence (AI) in hearing technology has evolved, how its use has changed so far, and where AI is taking us in the future.

Published on 23 May 2022


BIHIMA: When was artificial intelligence (AI) first used in hearing technology?

Dr. Dave Fabry (DF): I’m an audiologist and I’ve been in the profession for nearly 40 years. I would say the use of AI in hearing instruments started around 15 years ago. Until then, to equip patients with hearing instruments that could work in both quiet and noisy listening environments, we gave them multiple manual programmes, accessible on the hearing instrument, that would engage directional microphones, for example.

AI was incorporated after we’d gone from analogue to digital hearing instruments. The very first digital hearing instruments were introduced in the mid-90s.


BIHIMA: How has the use of AI in hearing technology changed over time?

DF: In the early to mid-2000s we started to see hearing instruments that used machine learning and could be trained for acoustic environmental classification (AEC), so that when a patient moved from a quiet to a noisy listening environment, the hearing instrument would automatically apply the directionality and noise management appropriate to that specific environment.
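
As a rough illustration of that idea (not Starkey’s implementation), here is a minimal sketch of how an environment classifier might drive processing settings; the feature names, thresholds, and profiles are invented for the example.

```python
# Hypothetical sketch of acoustic environment classification (AEC): features
# extracted from the microphone signal are mapped to an environment class,
# which selects directionality and noise-management settings automatically.
# All names, thresholds, and profiles here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AcousticFeatures:
    level_db_spl: float        # overall sound pressure level
    speech_likelihood: float   # 0..1, e.g. from modulation analysis
    music_likelihood: float    # 0..1

def classify_environment(f: AcousticFeatures) -> str:
    """Coarse rule-based classifier standing in for a trained model."""
    if f.music_likelihood > 0.6:
        return "music"
    if f.speech_likelihood > 0.5:
        return "speech_in_noise" if f.level_db_spl > 65 else "speech_in_quiet"
    return "noise" if f.level_db_spl > 60 else "quiet"

# Each class maps to a processing profile the instrument applies on its own.
PROFILES = {
    "quiet":           {"directional_mic": False, "noise_reduction": 0},
    "speech_in_quiet": {"directional_mic": False, "noise_reduction": 1},
    "speech_in_noise": {"directional_mic": True,  "noise_reduction": 3},
    "noise":           {"directional_mic": True,  "noise_reduction": 2},
    "music":           {"directional_mic": False, "noise_reduction": 0},
}

if __name__ == "__main__":
    restaurant = AcousticFeatures(level_db_spl=72, speech_likelihood=0.8, music_likelihood=0.2)
    env = classify_environment(restaurant)
    print(env, PROFILES[env])   # -> speech_in_noise, directional mic engaged
```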

The challenge was that, although the automatic process made the user experience easier, the classifiers are only about 80 to 85% accurate. Speech can be the stimulus of interest, but it can also be noise, and it’s very difficult for the hearing instrument to share the person’s intent about who or what to listen to. For example, if a person wanted to listen to a street performer with a guitar, that music would be the stimulus of interest; but if the person was walking by engaged in conversation and didn’t want to listen to the street performer, that music becomes noise, and the hearing instrument wouldn’t automatically know this.

This is where we see the next generation of hearing instruments: combining AI with user intent. In the last decade we’ve seen tremendous advances in the way hearing instruments are used.

I believe, for all of us in the industry, a seminal event was 2014, when the first made-for-iPhone hearing instruments were developed. That was really the beginning of the transition of hearing instruments from single-purpose devices into multi-purpose, multi-function ones that amplify speech and other sounds to improve audibility, reduce background noise, and let the user turn the volume up and down and control basic hearing instrument functions through an application.

In 2018, we first introduced sensors embedded in behind-the-ear (BTE) and receiver-in-the-canal (RIC) devices. In January 2020, Starkey launched a product with a feature we call “Edge Mode”. If a person is in a noisy environment, for example a restaurant, trying to speak to someone in front of them while a noisy person is behind them, the user can double-tap their hearing instrument, without needing an application on a smartphone. An acoustic scan of the environment is made, capturing the spatial location of speech, noise, and even music, and the hearing instrument prioritises the stimulus in front of the user, enhancing its audibility. So it’s a combination of automated environmental processing plus user intent: combining the benefits of human and machine.
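
As a simplified, hypothetical sketch of that “automation plus intent” combination (the function and field names are assumptions, not Starkey’s API), the double tap can be thought of as a trigger that biases the automatic settings toward the talker in front:

```python
# Hypothetical sketch of a double-tap feature combining automatic scene
# analysis with user intent. Names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class SceneScan:
    front_is_speech: bool   # spatial estimate: speech detected in front
    rear_noise_db: float    # estimated noise level behind the wearer

def automatic_settings(scan: SceneScan) -> dict:
    """What the always-on classifier would apply by itself."""
    return {"directional_mic": scan.rear_noise_db > 60, "noise_reduction": 2}

def on_double_tap(scan: SceneScan) -> dict:
    """User intent: prioritise whatever is in front and suppress the rest."""
    settings = automatic_settings(scan)
    if scan.front_is_speech:
        settings.update({"directional_mic": True,
                         "beam_direction": "front",
                         "noise_reduction": 3})
    return settings

if __name__ == "__main__":
    noisy_restaurant = SceneScan(front_is_speech=True, rear_noise_db=74)
    print(on_double_tap(noisy_restaurant))
```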

In April 2020, due to the COVID pandemic, people started wearing face masks, and users found that Edge Mode cleared up voices and made communication easier when they couldn’t lip-read. It was an unanticipated use that worked really well. Edge Mode uses AI for sound quality and speech intelligibility, only turning directional microphones on when noise is present.

Bluetooth wireless connectivity has allowed the hearing technology industry to develop accessories which provide outstanding performance along with cosmetic benefits.


BIHIMA: How can the use of AI in hearing instruments affect user experience?

DF: The hearing industry is on a journey using AI to monitor and improve the overall health and wellness of patients. Sensors incorporated into specific hearing instruments can monitor social engagement and physical activity, and detect falls.

The use of AI in hearing instruments is not only improving audibility, sound quality, and listening experience, but is being used to support overall health and give peace of mind to the user.

AI can monitor the very specific signature of what happens when a person falls. Our intelligent assistant sends text messages to contacts assigned by the wearer, alerting them if a fall event occurs. The hearing instrument user knows when these messages have been sent and whether each recipient has opened them. If no reply comes back, a recipient can physically locate the person who has had the fall, giving peace of mind to the user as well as their caregiver and family members.
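
As a much-simplified sketch of that flow (the thresholds, the fall “signature”, and the alerting stub are assumptions for illustration, not the production algorithm):

```python
# Hypothetical sketch of fall alerting: a crude fall "signature" (a large
# impact followed by near-stillness) triggers messages to contacts chosen by
# the wearer, with delivery/read status visible to the user. Illustrative only.

from dataclasses import dataclass

IMPACT_G = 2.5      # assumed impact threshold, in g
STILLNESS_G = 0.1   # assumed post-impact movement threshold

def looks_like_fall(accel_g: list) -> bool:
    """Very rough stand-in for the learned fall signature."""
    if not accel_g:
        return False
    peak_index = accel_g.index(max(accel_g))
    after = accel_g[peak_index + 1:]
    return accel_g[peak_index] > IMPACT_G and bool(after) and max(after) < STILLNESS_G

@dataclass
class FallAlert:
    contact: str
    delivered: bool = False
    opened: bool = False

def send_fall_alerts(contacts: list) -> list:
    """Stand-in for sending a text/push message to each assigned contact."""
    return [FallAlert(contact=c, delivered=True) for c in contacts]

if __name__ == "__main__":
    samples = [0.9, 1.0, 3.1, 0.05, 0.04, 0.03]   # impact, then stillness
    if looks_like_fall(samples):
        for alert in send_fall_alerts(["daughter", "neighbour"]):
            print(f"alert to {alert.contact}: delivered={alert.delivered}, opened={alert.opened}")
```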

AI can also be used for real-time translation. If a person is having a conversation with someone in a language they don’t understand, they can get the other person’s speech translated straight into their ear.

We also have real-time transcription capabilities, which can be really useful, especially for people with dexterity issues such as arthritis: the hearing instrument can use voice recognition to send messages for the person.

Among older people taking multiple tablets, adherence to medication schedules is less than 50%. We can programme a reminder into the intelligent assistant so the user takes their medication at the right time.

We also have an app called Thrive Care – the hearing instrument user can send a link to a chosen person, for example a family member, and that person can monitor the user’s activity and social engagement (if the user chooses), providing peace of mind for family members and for the user.


BIHIMA: How do audiologists benefit from the integration of AI in hearing instruments?

DF: One thing we know is that telehealth has been around for a while but wasn’t used that much until the pandemic, when many patients felt unsafe going to face-to-face appointments. We’ve seen a dramatic increase in the use of telehealth since the pandemic. In the past, people perhaps assumed that older people couldn’t or wouldn’t want to use technology, but they have taken to it really well, and the use of technology among senior populations has dramatically increased.

About a third of hearing appointments used to be for simple fine-tuning adjustments, and in many cases these can be handled just as easily via telehealth, saving travel time for the patient and time for the clinician. People who are still working might not have wanted to take time off for a simple adjustment, so in the past they may have lived with less-than-optimal settings. Telehealth makes it easier for audiologists to optimise the acoustic parameters of the hearing instrument.

We have to broaden our thinking on telehealth. We’ve launched a self-check diagnostic feature that the hearing instrument user can run from within the app; it instantly tests the function of the microphones, the circuitry, the receiver, and even the sensors in the hearing instruments, and reports whether each is functioning. The clinician benefits from knowing whether all the components are working.
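
A minimal sketch of what such a self-check might look like conceptually (the component names and test stubs are assumptions, not the actual feature):

```python
# Hypothetical sketch of an in-app self-check: each component runs a quick
# test and the results are summarised for the user and the clinician.
# Component names and test stubs are illustrative assumptions.

def run_self_check(checks: dict) -> dict:
    """Run each component's test and report functioning / not functioning."""
    return {name: ("functioning" if test() else "not functioning")
            for name, test in checks.items()}

if __name__ == "__main__":
    report = run_self_check({
        "microphones": lambda: True,
        "receiver":    lambda: True,
        "circuitry":   lambda: True,
        "sensors":     lambda: False,   # e.g. a sensor failing its test
    })
    for component, status in report.items():
        print(f"{component}: {status}")
```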

Data logging in hearing instruments provides a snapshot for the audiologist of how the patient is using their hearing instrument. It can also show the audiologist how many falls a patient may have had, so they can discuss fall risk with the patient without depending on the patient’s memory to recall fall events. The data can turn a sensitive conversation into a productive one about the preventative measures that can be taken to avoid falls. AI enables the audiologist to work at the top of their scope of practice and offer a holistic approach to patient care, looking at the connection to overall health and wellbeing.


BIHIMA: What does the future look like: how do you see the use of AI in hearing technology evolving?

DF: We’re working on applications that draw on more computing power than we can currently fit on the ear in hearing instrument circuits.

For iPhone users we have a feature called “Voice AI” that taps additional computing power and battery life beyond what we can have up at the ear; it uses deep neural network technology to provide better enhancement of sound quality and speech understanding in noise using the additional computing power on the phone. Right now, this is only for iPhone users with more significant hearing losses, and it requires the iPhone to be connected to the devices because some of the processing is offloaded onto the phone. Our next generation of products will do the processing in the hearing instrument itself, so it won’t be limited to iPhone users.
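
A minimal sketch of the offloading idea (the denoiser below is a trivial placeholder, not a real deep neural network, and the routing logic is an assumption for illustration):

```python
# Hypothetical sketch of offloading processing to the phone: audio frames are
# routed to a companion app that runs a DNN enhancer when connected, with a
# fall-back to on-instrument processing otherwise. Illustrative only.

def denoise_on_phone(frame):
    """Placeholder for a deep-neural-network speech enhancer on the phone."""
    return [0.8 * sample for sample in frame]   # trivial stand-in

def process_stream(frames, phone_connected):
    """Route each frame to the phone when available, else keep it on-ear."""
    for frame in frames:
        if phone_connected:
            yield denoise_on_phone(frame)   # extra compute/battery off the ear
        else:
            yield frame                     # on-instrument processing path

if __name__ == "__main__":
    mic_frames = [[0.1, 0.2, -0.1], [0.05, -0.3, 0.2]]
    for enhanced in process_stream(mic_frames, phone_connected=True):
        print(enhanced)
```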

We continue to see that the automation and use of AI in combination with human intelligence provides the best listening experience. Long term we’re trying to provide as effortless a solution as possible. The future is to put as much on board the hearing instrument as possible but of course always keeping audibility and sound quality as the number one priority.

The one thing that can’t be commoditised is caring, and all these developments will help clinicians care for their patients in a more holistic way.


ABOUT BIHIMA


BIHIMA, the British Irish Hearing Instrument Manufacturers Association, represents the hearing instrument manufacturers of Britain and Ireland, working in partnership with other professional, trade, regulatory and consumer organisations within the health care and charitable sectors. We raise consumer awareness about the latest hearing technology and aim to influence government and policy makers to improve the lives of people with hearing difficulties.

Source: Audio Infos UK issue no. 148, May-June 2022

Karen Noble
