Artificial Intelligence In Hearing Aids

While Artificial Intelligence (AI) has many applications, it can also feel like a marketing buzzword attached to product names and descriptions. It can be difficult for hearing aid users to understand what AI really means or what it is actually doing.

Artificial intelligence continues to advance and impact our daily lives. It is sad to say, but I don’t even have a paper map in my car anymore; I use a GPS app on my smartphone for directions. I tell my iPhone where I want to go, and a friendly voice gives me turn-by-turn instructions and tells me when I will arrive at my destination, taking into account the actual local traffic conditions at the time.

Even Hyundai, the Korean automobile manufacturer, has jumped into the AI arena, developing technology to help hearing-impaired drivers, who depend mainly on their senses of sight and touch.

Hyundai’s system uses AI to analyze external sounds. The car monitors the external environment and presents visual portrayals of sound patterns, such as the warning sounds of approaching emergency vehicles, as pictograms on a head-up display. The steering wheel is equipped with multicolored LED lights that help the driver navigate, and it also translates sound data into vibrations, giving the driver information about external conditions such as the distance to obstacles.
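
The article doesn’t describe Hyundai’s implementation, but conceptually the pipeline is simple: classify an external sound, then route it to a visual channel and a haptic channel. Here is a minimal sketch of that idea; the sound classes, pictograms, and vibration patterns are hypothetical, not Hyundai’s actual design.

    # Hypothetical sketch of a sound-to-visual/haptic alert pipeline, in the
    # spirit of the Hyundai system described above. All class names and
    # vibration patterns are illustrative.

    SOUND_ALERTS = {
        "siren":     {"pictogram": "[AMBULANCE]", "vibration": [0.2, 0.2, 0.2]},
        "car_horn":  {"pictogram": "[HORN]",      "vibration": [0.5]},
        "rail_bell": {"pictogram": "[TRAIN]",     "vibration": [0.1, 0.1, 0.1, 0.1]},
    }

    def show_on_hud(pictogram: str, direction_deg: float) -> None:
        # Stand-in for the head-up display: show what the sound is and where.
        print(f"HUD: {pictogram} at {direction_deg:.0f} degrees")

    def pulse_steering_wheel(pattern: list) -> None:
        # Stand-in for the haptic channel: vibration pulse lengths in seconds.
        print(f"Steering wheel vibration pattern: {pattern}")

    def alert_driver(sound_class: str, direction_deg: float) -> None:
        """Route a classified external sound to the visual and haptic channels."""
        alert = SOUND_ALERTS.get(sound_class)
        if alert is None:
            return  # unknown sounds are ignored rather than distracting the driver
        show_on_hud(alert["pictogram"], direction_deg)
        pulse_steering_wheel(alert["vibration"])

    alert_driver("siren", 135.0)  # e.g. an ambulance approaching from behind-left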

Use of Artificial Intelligence In Hearing Aids

Artificial intelligence is changing the way health professionals work and is helping them arrive at important medical decisions; the same is true for hearing health professionals.

In 2005, Robert Margolis, Ph.D., Professor and Director of Audiology at the University of Minnesota, recounted the history of diagnostic tests in audiology and how the profession has not kept pace with the advancement of technology. He quoted James Jerger, Ph.D., who stated in 1963 that almost all routine hearing testing could be done by machine, leaving the audiologist free to analyze and interpret the results. Margolis summarized, “My opinion is that if our profession does not embrace automated testing and drive the process of progress, other professions will do it for us.” Many years have passed since Margolis’ comments, and automated audiometry is still not the standard of care in the hearing care industry, but progress is being made, specifically in the area of hearing aid technology.
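
To make “automated audiometry” concrete: most automated tests implement a threshold search such as the classic Hughson-Westlake “down 10, up 5” staircase. Below is a minimal sketch of that procedure, assuming the test hardware supplies a listener-response function; a clinical implementation adds familiarization, retest rules, and masking.

    # Minimal sketch of the Hughson-Westlake "down 10, up 5" threshold search,
    # the staircase procedure most automated audiometers are built on.
    # `heard(level_db)` stands in for the hardware/listener response.

    def find_threshold(heard, start_db=40, floor_db=-10, ceiling_db=100):
        """Return the lowest level (dB HL) heard on 2 ascending runs."""
        level = start_db
        # Descend in 10 dB steps until the tone is no longer heard.
        while heard(level) and level > floor_db:
            level -= 10
        ascents = {}  # level -> number of "heard" responses on ascending runs
        while level <= ceiling_db:
            level += 5                      # ascend in 5 dB steps
            if heard(level):
                ascents[level] = ascents.get(level, 0) + 1
                if ascents[level] >= 2:     # heard twice at this level: threshold
                    return level
                level -= 10                 # heard: drop 10 dB and ascend again
        return None                         # no response within limits

    # Example: simulate a listener whose true threshold is 35 dB HL.
    print(find_threshold(lambda db: db >= 35))  # -> 35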

In 2004, Oticon was the first to use artificial intelligence in hearing aids, rapidly comparing the outcomes of particular feature-set combinations in an attempt to provide a better voice-to-noise ratio in any given sound environment. This rapid-comparison technology provided options without using a closed-loop machine-learning technique.
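
The article doesn’t detail Oticon’s algorithm, but the open-loop idea of rapid comparison can be sketched simply: score each candidate feature-set combination against the current sound scene and apply the winner, with no learning between decisions. The feature names and scoring model below are illustrative, not Oticon’s actual design.

    # Hypothetical sketch of open-loop "rapid comparison": evaluate a fixed menu
    # of feature-set combinations against the current input and pick the one
    # with the best estimated speech-to-noise benefit. No closed learning loop.
    from itertools import product

    FEATURES = {
        "directional_mic": (False, True),
        "noise_reduction": ("off", "mild", "strong"),
    }

    def estimated_snr_benefit(combo: dict, sound_scene: dict) -> float:
        # Stand-in scoring model: in noisy scenes, directionality and noise
        # reduction help; in quiet, strong processing costs sound quality.
        score = 0.0
        if sound_scene["noise_db"] > 65:
            score += 3.0 if combo["directional_mic"] else 0.0
            score += {"off": 0.0, "mild": 1.5, "strong": 2.5}[combo["noise_reduction"]]
        else:
            score -= {"off": 0.0, "mild": 0.5, "strong": 1.5}[combo["noise_reduction"]]
        return score

    def best_combination(sound_scene: dict) -> dict:
        combos = [dict(zip(FEATURES, values)) for values in product(*FEATURES.values())]
        return max(combos, key=lambda c: estimated_snr_benefit(c, sound_scene))

    print(best_combination({"noise_db": 72}))  # noisy restaurant-like scene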

Fourteen years later, in April 2018, Widex introduced the first hearing aid that employs machine learning to optimize the user’s hearing experience. Input from wearers, given through a smartphone app, is combined with real-life hearing situations. Widex reports that the machine learning moves beyond simple decision-making: anonymous aggregate data collected from Widex Evoke wearers around the world is used to benefit all other Widex Evoke users. This step in technological advancement encourages hearing health practitioners to take advantage of more sophisticated artificial intelligence systems to better serve our clientele.
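
At its core, this app-driven approach is preference learning from paired comparisons: the app plays two candidate settings, the wearer picks the better one, and the model updates its estimate of the preferred setting. Here is a deliberately simplified sketch of that loop; the toy update rule is mine, and production systems reportedly use more sophisticated machine learning.

    # Simplified sketch of learning a wearer's preferred equalizer setting from
    # A/B comparisons, the kind of interaction the Evoke app offers.
    import random

    def ab_learn(user_prefers, n_rounds=20, step=0.3):
        """user_prefers(a, b) -> True if the wearer picks setting a over b.
        Each setting is a [bass, mid, treble] gain list in dB."""
        estimate = [0.0, 0.0, 0.0]               # current best guess
        for _ in range(n_rounds):
            # Propose a random variation of the current estimate to test against it.
            candidate = [g + random.uniform(-3, 3) for g in estimate]
            winner = candidate if user_prefers(candidate, estimate) else estimate
            # Move the estimate toward whichever setting won the comparison.
            estimate = [e + step * (w - e) for e, w in zip(estimate, winner)]
        return estimate

    # Example: a simulated wearer who prefers roughly +6 dB bass, -2 dB treble.
    target = [6.0, 0.0, -2.0]

    def prefers(a, b):
        # The simulated wearer picks whichever setting is closer to their taste.
        dist = lambda s: sum((x - t) ** 2 for x, t in zip(s, target))
        return dist(a) < dist(b)

    print([round(g, 1) for g in ab_learn(prefers)])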

In a recent study at the University of Hong Kong and the Ann & Robert H. Lurie Children’s Hospital of Chicago, artificial intelligence was used to predict language development in deaf children with cochlear implants. A machine-learning algorithm used pre-implant MRI scans to detect abnormal brain development, giving clinicians information about how to personalize post-surgery efforts to help those children develop better hearing and language skills.
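
The study’s specifics aside, the general recipe is standard supervised learning: extract features from pre-implant MRI scans, label each child by later language outcome, train a classifier, and predict for new candidates. The schematic sketch below uses scikit-learn and synthetic data; the features, labels, and model are placeholders, not the study’s actual variables.

    # Schematic sketch of the supervised-learning recipe behind such a study:
    # features from pre-implant MRI scans, labels from later language outcomes.
    # The data here is synthetic; the real study's features and model differ.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_children, n_mri_features = 80, 10

    X = rng.normal(size=(n_children, n_mri_features))   # e.g. regional volumes
    # Synthetic rule: outcome loosely tied to two of the features, plus noise.
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n_children) > 0).astype(int)

    model = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5)         # honest accuracy estimate
    print(f"cross-validated accuracy: {scores.mean():.2f}")

    model.fit(X, y)
    new_child = rng.normal(size=(1, n_mri_features))
    print("predicted good-outcome probability:",
          round(model.predict_proba(new_child)[0, 1], 2))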

For years, hearing aid providers have used AI technology to develop initial target-based fittings. More advanced uses of AI can help improve hearing aid fitting outcomes beyond that starting point.
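
A target-based fitting means computing prescribed gain at each frequency from the wearer’s audiogram. As a concrete, simplified example, the public NAL-R prescription (Byrne & Dillon, 1986) sets insertion gain from the hearing thresholds; the sketch below assumes the commonly published NAL-R constants, and modern prescriptions such as NAL-NL2 are more elaborate.

    # Simplified sketch of a target-based fitting using the public NAL-R
    # prescription: prescribed insertion gain is computed from the audiogram.

    NAL_R_K = {250: -17, 500: -8, 1000: 1, 2000: -1, 4000: -2}  # dB corrections

    def nal_r_targets(audiogram: dict) -> dict:
        """audiogram: {frequency_hz: threshold in dB HL} -> target gains (dB)."""
        three_freq_avg = (audiogram[500] + audiogram[1000] + audiogram[2000]) / 3
        x = 0.15 * three_freq_avg
        return {f: round(max(0.0, x + 0.31 * audiogram[f] + NAL_R_K[f]), 1)
                for f in NAL_R_K}

    # Example: a gently sloping moderate hearing loss.
    print(nal_r_targets({250: 25, 500: 35, 1000: 45, 2000: 55, 4000: 65}))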

After the initial fitting, back in the real world, what an individual wants to hear varies depending on the many factors that make up the specific sound environment they are in at that specific time. They might want to lower the conversation level around them if they’re sitting by themselves in a coffee shop, or conversely, increase the conversation level if they are with friends. If listening to music, a person may want to accentuate a certain element of the music. How, who, or what an individual wants to hear is described as their listening intention.

Hearing aid sound processing is based on algorithms and rules that are driven by certain assumptions. These assumptions are incorporated into the automatic hearing aid processing, with the aim of optimizing the user’s ability to hear in different sound environments. Using this assumption-based approach, users can be moved closer to overall satisfaction, but there will still be listening needs that cannot be met by automation.
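
To illustrate what “algorithms and rules driven by certain assumptions” looks like in practice, here is a toy sketch of rule-based sound-environment classification: measure a few acoustic properties, apply hand-written thresholds, and pick a processing program. The thresholds, class names, and programs are illustrative; real hearing aids use many more cues.

    # Toy sketch of assumption-based automatic classification: hand-written
    # rules map a few acoustic measurements to a processing program.

    def classify_environment(level_db: float, speech_modulation: float) -> str:
        """speech_modulation: 0..1, how strongly the envelope fluctuates at
        speech-like rates (roughly 2-8 Hz)."""
        if level_db < 45:
            return "quiet"                 # assumption: low level = quiet room
        if speech_modulation > 0.6:
            return "speech_in_noise" if level_db > 65 else "speech"
        return "noise"                     # loud but not speech-like

    PROGRAMS = {
        "quiet":           {"noise_reduction": "off",    "directional_mic": False},
        "speech":          {"noise_reduction": "mild",   "directional_mic": False},
        "speech_in_noise": {"noise_reduction": "strong", "directional_mic": True},
        "noise":           {"noise_reduction": "strong", "directional_mic": False},
    }

    env = classify_environment(level_db=70, speech_modulation=0.8)
    print(env, PROGRAMS[env])   # -> speech_in_noise, with its settings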

Often, unmet hearing needs result in fine-tuning appointments with a hearing care practitioner, but it can be difficult to explain the specific situation that caused problems. Instead, the Widex Moment and Evoke hearing aids provide the wearer a powerful tool to do the fine-tuning in real time while in that situation.

While hearing aid engineers endeavor to make the most automatic hearing devices possible, it seems that there will always be situations where the listening intention of the wearer is better met via powerful and effective personalization in the moment. Widex Moment and Evoke hearing aids are changing the way end-users interact with their hearing aids, providing them with opportunities for immediate personalized improvement based on their own specific desires, intentions and preferences.

