AI-powered hearing aids: facts and fiction
- Koorosh Nejad

"Modern hearing aids powered by AI (artificial intelligence) and DNN (deep neural network) processing are far more than simple sound amplifiers: they are adaptive, intelligent systems capable of learning from the user, optimising sound in real time, and enhancing quality of life in ways unthinkable just a decade ago". You might have heard this line or a similar one from your audiologist, or those YouTube videos promoting a specific model of hearing aid, but is it true, or fiction?
While the above statement is partially true, there is a lot of buzz around AI-powered hearing aids, much of it, in my opinion, a sales pitch. I feel the need to clarify AI in hearing aids for my patients: what is real and available today, and what is fiction (or, let's say, coming soon).
In this article, I explore how AI processing works inside hearing aids, what it can do, and why it represents a major leap forward for people with hearing loss. This article is for you if you are interested in learning more about advanced hearing aids, or if you are in the market for a pair of new AI-powered hearing aids.

The technologies powering AI hearing aids
Deep Neural Networks (DNNs) - These are advanced AI algorithms inspired by the human brain. In hearing aids, DNN sound processing excels at separating speech from background noise — even in complex acoustic situations like busy restaurants or crowded gatherings. What hearing aid manufacturers refer to as AI is speech pattern recognition, modelled on the human brain and programmed into the hearing aids. DNN processing is already available in several AI-powered hearing aids. The next step will be for hearing aids to recognise familiar voices and prioritise them for clearer communication; this is yet to become available.
Machine Learning - Machine learning is a type of AI that allows a device to improve over time by identifying patterns in data or user behaviour (e.g. the way Amazon learns your shopping habits) — in this case, the wearer’s sound environments and preferences. Hearing aids equipped with machine learning could adapt to a user’s habits and automatically adjust settings based on past experiences. These features are still in their infancy in hearing aids and have a long way to go.
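To give a feel for what "separating speech from noise" involves, here is a deliberately simplified sketch of a classical technique, spectral gating, which attenuates frequency bands dominated by noise. A DNN in a modern hearing aid effectively learns a far more sophisticated, moment-by-moment version of this gain function; the code below is a toy illustration with invented parameter names, not any manufacturer's actual algorithm.

```python
import numpy as np

def spectral_gate(signal, noise_sample, frame=256, gain_floor=0.1):
    """Toy noise suppressor: attenuate frequency bins that fall near or
    below an estimated noise floor. A DNN learns a far smarter gain rule."""
    # Estimate the noise floor per frequency bin from a noise-only snippet
    noise_mag = np.abs(np.fft.rfft(noise_sample[:frame]))
    out = np.zeros_like(signal)
    for start in range(0, len(signal) - frame + 1, frame):
        spec = np.fft.rfft(signal[start:start + frame])
        # Per-bin gain: keep bins well above the noise floor, duck the rest
        gain = np.clip(1.0 - noise_mag / (np.abs(spec) + 1e-9), gain_floor, 1.0)
        out[start:start + frame] = np.fft.irfft(spec * gain, n=frame)
    return out
```

In a real device this kind of processing runs on dedicated low-power silicon within a millisecond-scale latency budget; the point here is only the idea of a per-frequency gain that favours speech-dominated bands.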
What does “AI” really mean in hearing aids?
At its core, AI refers to computer systems designed to perform tasks that traditionally require human intelligence — such as recognising patterns, learning from experience, and making decisions automatically. When applied to hearing aids, AI enables hearing aid devices to:
Distinguish speech from noise (already available today)
Adjust sound settings in real time (already available today)
Anticipate and adapt to changing environments (already available today)
Learn the wearer’s listening preferences (coming soon)
Tune to a familiar voice for an optimal speech-in-noise experience, similar to the way Amazon Alexa can recognise a specific person's voice, e.g. your partner's (coming soon)
Rather than simply amplifying all sounds based on the hearing loss profile of the patient, AI-powered hearing aids analyse each listening situation and optimise sound dynamically — much like how our brains naturally filter and interpret what we hear.
Items 1 to 3 in the above list are already implemented in AI- and DNN-enabled hearing aids, such as Phonak Infinio Sphere i90, Starkey Omega AI 24, and Oticon Intent 1R. These hearing aids distinguish speech patterns from the background noise and improve speech intelligibility in real time. The processing is live and adapts to the changing soundscape. If you have ever tried any of these hearing aids, you may have noticed that, in certain situations, the speech of a person at a distance sounds robotic.
This is a side effect of the AI algorithm working hard to recognise human speech in the received sound and emphasise it against the background. In the presence of loud background noise, the aids again do their best to recognise human voice patterns and amplify them over the noise. Depending on your hearing loss and the fitting strategy, the outcome can range from natural-sounding speech to speech that sounds a bit robotic, but intelligibility noticeably improves.
At the time of this report, the best AI-powered hearing aids can improve the signal-to-noise ratio (SNR) by up to 10 dB (see my full report on Phonak Infinio Sphere here), while hearing aids without AI can only improve the SNR by about 6 dB.
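Because decibels are logarithmic, the gap between those two figures is larger than it looks: a 10 dB SNR improvement corresponds to roughly a tenfold improvement in the speech-to-noise power ratio, while 6 dB is roughly fourfold. A quick sketch of the conversion (the function name is my own):

```python
def snr_gain_ratio(delta_db: float) -> float:
    """Convert an SNR improvement in decibels into a linear power ratio."""
    return 10 ** (delta_db / 10)

# 10 dB of SNR improvement -> speech stands out ~10x more, in power terms
ai_gain = snr_gain_ratio(10)           # 10.0
# 6 dB of SNR improvement  -> roughly 4x
conventional_gain = snr_gain_ratio(6)  # ~3.98
```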
Adaptive settings - Some of today's hearing aids can notice and learn the wearer's setting preferences, but this ability is quite limited. It works in conjunction with the geo-mapping feature on the wearer's mobile phone; Starkey Omega AI is a good example. On the myStarkey App, you can preset a setting for a specific location, e.g. a pub you usually visit. When you return to that location, the hearing aid automatically loads the setting you saved for it. It is a cool feature, but it is not available in all AI-powered hearing aids, e.g. Oticon Intent. The next level of location mapping would be for the hearing aid to learn that you change the setting whenever you are in a certain geolocation, and to adapt automatically after a certain number of visits. This feature is not available in any hearing aid at the time of this report.
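Conceptually, the geolocation preset feature amounts to a geofence lookup: compare the phone's current GPS position against each saved location and load the matching preset. The sketch below is a hypothetical illustration of that logic only; the function names and the 100-metre radius are my own assumptions, not how the myStarkey App is actually implemented.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pick_preset(here, saved_presets, radius_m=100):
    """Return the preset whose saved location contains the current position."""
    for (lat, lon), preset in saved_presets.items():
        if haversine_m(here[0], here[1], lat, lon) <= radius_m:
            return preset
    return "default"
```

For example, with a preset saved for a pub at `(51.5007, -0.1246)`, standing a few metres away would load the "pub" profile, while anywhere else falls back to "default".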
The use of an accelerometer in hearing aids has enabled engineers to adapt the settings to the situation and posture of the wearer. For example, the Phonak Infinio Sphere switches to a wider or even an omnidirectional microphone mode when you are walking, as the device assumes that while walking you might be speaking to a person on your left or right. When you stop walking, it switches to a directional mode so you can hear the speaker in front of you, again assuming that you are talking to a person facing you when you are stationary. In another model, the Signia Pure Charge and Go recognises speech sound patterns in real time and can process the speech in a separate path from the noise, so it stays focused on the speaker even if he or she is moving in front of you (more in this article).
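The accelerometer-driven switching described above can be pictured as a simple motion test: when recent acceleration readings vary a lot, the wearer is probably walking, so a wider microphone mode is chosen; when they are steady, a directional beam toward the front makes more sense. This is a hypothetical sketch; the threshold and the decision rule are my own illustrative assumptions, not Phonak's actual logic.

```python
import statistics

def mic_mode(accel_magnitudes, walk_threshold=0.5):
    """Choose microphone directionality from recent accelerometer readings
    (in m/s^2): high variability suggests walking, low variability suggests
    standing still and facing a conversation partner."""
    if statistics.pstdev(accel_magnitudes) > walk_threshold:
        return "omnidirectional"  # walking: talker could be to either side
    return "directional"          # stationary: focus on the talker in front
```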
Voice recognition - The ability of hearing aids to recognise a specific voice pattern, e.g. your partner's voice, and adapt to a specific setting is another feature yet to come. This capability uses AI and is available to some degree in consumer technology today: Amazon Alexa, for example, can be trained on a specific voice pattern and recognises that voice for future instructions, shopping habits, favourite quantities to order and so on. But this feature is yet to be implemented in hearing aids.
Real-world capabilities of AI hearing aids
AI enables hearing aids to do far more than traditional devices. Here are the key functions users benefit from:
Automatic Environment-Based Adjustments - AI hearing aids continuously analyse incoming sound and automatically tweak settings — reducing background noise, boosting speech clarity, and optimising volume without manual input. Whether you’re at a noisy café or a quiet meeting, the system adapts instantly for the best listening experience. Most hearing aid manufacturers call the component behind this a sound or environment classifier, and these classifiers are getting better and better through AI. Classifiers are upgraded via firmware updates, so at the time of this article they are not updated in real time. Please note that environment classifiers also exist in hearing aids without AI, but they are more accurate in AI-powered models.
Context-Aware Mode Switching - Some AI hearing aids can recognise specific locations and automatically switch to custom hearing profiles you’ve pre-set for those places (e.g., home, office, favourite restaurant). This means your preferences are applied instantly whenever you arrive — no manual switching required after the initial set-up. Again, this is not available in all hearing aid models and, at the moment, it requires an initial user adjustment. Starkey Omega is one of the hearing aids with a geolocation setting, which can be set up for different locations through the myStarkey App.
Prioritising Important Voices - Advanced AI systems can identify and enhance human speech, mimicking how the brain focuses in a crowd. This replication of the “cocktail party effect” helps users follow conversations more easily in challenging environments. This is the core AI-DNN processing of AI hearing aids. The algorithm gets better with each firmware update, but is yet to become dynamic (as discussed earlier in this article). Hearing aids without AI do not have this capability.
Health and Wellness Monitoring - Today’s AI hearing aids can work as health wearables. They can track steps, activity levels, and other wellness metrics, offering insights into daily habits and broader health trends. Starkey and Phonak Infinio are the leaders on this front, but you can find health monitoring features in other makes of hearing aids, too. Honestly, many health features were implemented in hearing aids even before the introduction of AI, but admittedly they are getting better with the help of AI.
Safety Features Like Fall Detection - Some devices include motion sensors and AI algorithms that can detect sudden falls and automatically send alerts to caregivers or family members — adding an important layer of safety for older adults or those with mobility challenges. As with the health features, this was available in Starkey hearing aids well before the introduction of AI, but it is getting better and smarter through AI.
Real-Time Translation and Voice Control - For travellers and multilingual environments, certain AI hearing aid apps offer translation into dozens of languages. Translation in hearing aids is still in its infancy: only Starkey offers a basic translation service in the myStarkey App, similar to Google Translate but inside the hearing aid's app. It is not real-time, and it does not speak the translation into your hearing aids. I look forward to such a feature coming to hearing aids soon. Other devices incorporate virtual assistant functions — much like Siri or Alexa — allowing users to adjust settings or ask questions using natural speech. For example, with a double-tap on your Phonak Infinio, you can activate Siri on your iPhone and communicate questions or instructions through the hearing aid to your mobile phone and to Siri.
Some of the above features (specifically health monitoring, fall detection and translation) existed before the introduction of AI in hearing aids, but I'd say they are getting better through AI. Health monitoring, fall detection and translation are still secondary, complementary functions in the design of hearing aids rather than core ones.
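At its simplest, an environment classifier of the kind mentioned above maps a handful of acoustic features (overall sound level, whether speech is detected, and so on) to one of a few listening programs. Real classifiers use many more features and trained models rather than fixed rules; this rule-based sketch, with invented feature names and thresholds, only illustrates the shape of the decision.

```python
def classify_environment(level_db: float, speech_present: bool) -> str:
    """Toy environment classifier: map two acoustic features to a program.
    Thresholds and program names are illustrative, not any vendor's values."""
    if speech_present and level_db >= 70:
        return "speech-in-noise"   # e.g. busy restaurant: strong noise reduction
    if speech_present:
        return "speech-in-quiet"   # e.g. quiet meeting: mild processing
    if level_db >= 70:
        return "noise"             # e.g. traffic: comfort-oriented attenuation
    return "quiet"                 # e.g. at home: near-transparent amplification
```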
Beyond Sound: Connectivity and Smart Features
Modern hearing aids deliver connectivity options, too. Many integrate seamlessly with smartphones via Bluetooth for direct streaming of phone calls, music, and media. Most hearing aid models even support hands-free calling, where the microphones on the hearing aids serve as the input source, keeping both hands free.
I would say that AI has not done much on the connectivity side: hands-free calls and Bluetooth connectivity have been part of hearing aids for the past 5-10 years, even before AI started showing up on the processing side.
What the Future May Hold
AI in hearing aids continues to evolve rapidly. Industry innovation is moving toward even more powerful on-device processing, improved speech recognition in real time, and deeper integration with everyday technology. Here are the areas that I think AI could develop in future hearing aids:
Better speech intelligibility through voice recognition
Longer battery life through more efficient AI-DNN processing
Real-time translation
Connectivity to smart homes
Integrated assistants for communication, health, and daily life.
What are the best AI-powered hearing aids on the UK market?
As of the date of this article, the following hearing aids have AI processing:
Phonak Infinio Sphere Ultra - In late 2025, Phonak updated the firmware of the Phonak Infinio Sphere, and through this update the Sphere hearing aids became more efficient in speech recognition and battery consumption. Phonak came late to the AI race in hearing aids but introduced a product that proved to be better than Oticon Intent, the flagship of Oticon's AI-driven hearing aids, which was introduced to the market in 2024. The Phonak Sphere has two processors: the ERA chip is in constant use until a noisy situation triggers the DEEPSONIC chip, the one that does the AI-DNN processing. Please read our full review of the Phonak Infinio Sphere in this article.
Starkey Omega AI - Starkey has claimed AI processing in its hearing aids since the generation called Starkey Evolve AI, but in my own experience the AI processing was only noticeable from Starkey Edge onwards, and then in their latest model, Starkey Omega AI. The Starkey Omega AI is always in AI mode. You can activate an enhanced AI mode called "Edge Mode" by double-tapping or through the myStarkey App for better noise management and speech intelligibility.
Oticon Intent - Oticon Intent has been Oticon's flagship AI-powered hearing aid, and it was the best hearing aid in terms of speech intelligibility in noise before the arrival of the Phonak Infinio Sphere. The AI processing in Oticon Intent is always on and does not need activation by the user. Some sound adjustment can be done through the Oticon Companion app, which is not the most sophisticated hearing aid app on the market: it offers neither geolocation mapping for hearing aid settings nor translation.
Oticon Zeal - Oticon Zeal is the first rechargeable, Bluetooth CIC hearing aid with AI processing. It uses the same SIRIUS chipset found in Oticon Intent and is Auracast-enabled, which makes it future-proof in terms of connectivity. Please see our full review of Oticon Zeal hearing aids here. As with Oticon Intent, the AI mode in Oticon Zeal is always on.
Resound Vivia - AI activation in the Resound Vivia is manual; the user switches it on through the Resound hearing aid app.
Does the AI and DNN processing in hearing aids work when the patient is fitted with one-sided hearing aids?
I have some good news for patients with single-sided deafness (SSD) who only need one hearing aid. Directionality in conventional hearing aids is the most effective technique for improving the hearing of speech coming from in front of the wearer. But when the patient is completely deaf in one ear and wears a hearing aid in the better ear (the one with hearing loss), there is no directionality in effect, simply because there is only one hearing aid. The AI and DNN processing that improves speech intelligibility is an independent process and is fully effective even with only one hearing aid fitted. Keep in mind that there can still be a further improvement in SNR and speech intelligibility when an AI-powered hearing aid is fitted binaurally, as the improvements from directional processing and from AI are combined. But it is a fact that a single hearing aid with AI-DNN processing is certainly more effective than a single hearing aid without it.
Which hearing aid has the best AI-DNN processing?
At the time of this article, Phonak Infinio Sphere Ultra, Oticon Intent and Starkey Omega AI are the best AI-powered hearing aids, with Phonak Sphere offering the best speech in noise intelligibility through AI-DNN processing. Please visit the related links at the top of this article for our reviews and comparison between these three hearing aids.
Is the AI processing always on in the AI-powered hearing aids, or does it need to be activated?
In some hearing aids the AI activation is automatic, while in other models it is always on. For example, the AI processing is always on in the Oticon Intent and Oticon Zeal hearing aids, as well as the Starkey Omega AI. In the Phonak Sphere, the AI processing switches on automatically when the user is in a noisy environment. This is because the AI processing in the Phonak Sphere is much heavier than in the other products on the market, and hence takes substantial battery power; the manufacturer therefore designed the hearing aid so that the AI is not activated when the environment is not noisy.
My hearing aid does not have AI processing, so how does it make speech understanding better in a noisy environment?
Premium hearing aids each have different sound processing strategies to control the noise and improve hearing speech in a noisy environment. Please refer to our hearing aid reviews or get in touch through a telecare appointment on HearingNow for further details on each model.
My hearing aids are not AI-driven; should I upgrade to those with AI and DNN processing?
Upgrading your hearing aids is a personal decision. My advice to my patients is always to try the latest models first, to see if they provide a meaningful improvement in hearing and speech understanding in a noisy environment, before buying. Hearing aids without AI and DNN processing still help with hearing speech in a noisy environment, albeit not as well as the AI-powered models. You also need to consider that other factors, such as the acoustics and the fit of your hearing aids by your audiologist, might have a bigger impact on your hearing experience. Often, a fresh hearing test and an adjustment with REM (real-ear measurement) improves your hearing without the need to buy new hearing aids.
Which model of in-the-ear hearing aids has AI processing?
At the moment, only Oticon Zeal, Oticon Own and Starkey Omega CIC/IIC models have the chipset with AI-DNN processing onboard. Please see this article for a comparison between Oticon Zeal and Starkey Omega AI CIC hearing aids.
Conclusion
AI hearing aids represent a major shift from traditional amplification devices to intelligent, adaptive systems that learn, respond, and evolve with the user. By combining advanced machine learning, neural networks, and smart sensors, these devices deliver more natural, personalised, and effective hearing support than ever before.
Nevertheless, there is a lot of room for improvement in the AI and DNN processing used in hearing aids. My best advice to patients is to complete their research and try a couple of the AI hearing aids before committing to a purchase. I also advise attending a local clinic for fitting, as the fit and acoustics of the hearing aids are equally important as, if not more important than, the technology used in the sound processing.
Questions?
Please get in touch with a free telecare consultation if you have any questions.