Oticon is using AI to help Kiwis hear this World Hearing Day

Today is World Hearing Day, and if you didn’t know that, you should check your hearing, because I’ve told you like three times.

But on a more serious note, you should probably get your hearing checked, because many of us have picked up significant hearing loss from standing too close to the speakers at concerts and parties, or from cranking up the volume on our headsets when we game.

As such, we are not hearing as well as we could, and the only way to find out is to get tested. Oticon even has an online test you can do, so you don’t have to break Level 3 rules if you are in Auckland.

You can try the test here!

Even cooler than finding out you are half deaf is the tech that can help.

Oticon’s latest hearing aid is loaded with AI to make the sounds around you more natural and help take the strain off your brain.

Corey Ackerman can explain it better, so read his quotes below, and if you still don’t get it, go get your hearing tested anyway.

Corey Ackerman of Oticon New Zealand said, “The earlier hearing loss is detected and treated, the less impact it can have on a person’s life. The ability to communicate well and feel confident in social environments is vital to keep people leading an active life, which has significant benefits for health and wellbeing.”

Oticon More, a world-first hearing aid trained with 12 million real-life sound scenes, has arrived in New Zealand. The new device, equipped with groundbreaking artificial intelligence (AI), was developed after research revealed that people with hearing loss need access to all sounds for their brains to work in a natural way.

“Most people think we hear with our ears, but our brains are actually our main tool for hearing. Oticon More uses AI technology, a Deep Neural Network, to help the brain hear sound in a natural and effective way.

“Traditional hearing aids can block out vital surrounding sound, but Oticon More scans and analyses a sound scene 500 times per second, allowing the brain to process key sounds, such as someone else speaking or a bird chirping, even in a noisy, crowded environment,” says Ackerman.

Oticon More uses one of the most advanced technologies on its new hearing aid platform: a Deep Neural Network trained with 12 million everyday-life sound scenes.

As a result, the hearing aid has learned to recognise all the varying types of sounds, their details and how they should ideally sound.

“When you limit what you can hear to just a single person speaking, which many hearing aids do, your brain is forced to work harder in an unnatural way, and you can be cut off from other conversations around you.

“By helping the brain to process sound in a natural way, we can better help reduce the health and life problems associated with untreated hearing loss,” says Ackerman.

The device, which can be linked to compatible smartphones, also allows users to directly stream music and phone calls into their ears and even connects to the TV, computer and smart home devices with the use of additional accessories.

Compared with previous-generation hearing aids, Oticon More offers a clearer and more distinct contrast between sounds, something conventional technology has never before been able to deliver.

“Hearing loss often forces people to avoid situations with too much noise, but Oticon’s progress in the use of AI is a quantum leap in creating natural, clear, complete and balanced sounds. We hope this advancing sonic technology will deliver greater freedom for many,” says Ackerman.
