{"id":1003,"date":"2022-03-18T16:40:27","date_gmt":"2022-03-18T23:40:27","guid":{"rendered":"https:\/\/larsonaudiology.com\/?p=1003"},"modified":"2022-03-18T16:40:47","modified_gmt":"2022-03-18T23:40:47","slug":"how-oticons-research-is-improving-hearing-aids","status":"publish","type":"post","link":"https:\/\/larsonaudiology.com\/how-oticons-research-is-improving-hearing-aids\/","title":{"rendered":"How Oticon’s Research Is Improving Hearing Aids"},"content":{"rendered":"\n

Oticon has been at the forefront of hearing loss research and hearing aid technology since its inception in 1904. Its newest product, the Oticon Opn S, is the first hearing aid proven to help the brain organize sounds. Using groundbreaking EEG research, independent scientists found that the Oticon Opn S can make it easier for users to follow conversations.

Continue reading to learn more about this revolutionary hearing aid.

What Is the EEG Research Method?
\"audiologist<\/figure><\/div>\n\n\n\n

Electroencephalography (EEG) is a technique researchers use to measure how well the brain tracks the speech a listener is paying attention to over a given period of time.

Using electrodes placed on the scalp and mounted in an elastic cap worn by participants, this method measures the electrical activity generated by the brain. It is similar to auditory brainstem response (ABR) testing, which measures the brain’s response to rapid auditory stimuli, but EEG in this context measures the brain’s response to attended speech during a listening activity.
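For readers curious what “tracking attended speech” looks like in practice, here is a minimal, illustrative sketch of the general envelope-tracking idea used in EEG attention research. It is not Oticon’s actual analysis pipeline: the sampling rate, the synthetic signals and the envelope_correlation helper are assumptions made up purely for illustration.

```python
# Illustrative only: not Oticon's analysis pipeline. A toy version of the
# "envelope tracking" idea behind EEG-based attention measures, using a
# made-up sampling rate and synthetic signals.
import numpy as np

FS = 64  # assumed sampling rate (Hz) after downsampling EEG and audio envelopes


def envelope_correlation(eeg_channel, speech_envelope):
    """Correlate one EEG channel with a speech amplitude envelope.

    A stronger correlation suggests the brain is following that speech
    stream more closely (a simplified stand-in for attention decoding).
    """
    eeg = (eeg_channel - eeg_channel.mean()) / eeg_channel.std()
    env = (speech_envelope - speech_envelope.mean()) / speech_envelope.std()
    return float(np.corrcoef(eeg, env)[0, 1])


# Toy data: 30 seconds of synthetic signals. The fake EEG partly follows the
# "attended" envelope, so its correlation should come out higher.
rng = np.random.default_rng(0)
n_samples = 30 * FS
attended = np.abs(rng.standard_normal(n_samples))    # envelope of attended talker
unattended = np.abs(rng.standard_normal(n_samples))  # envelope of ignored talker
eeg = 0.4 * attended + rng.standard_normal(n_samples)

print("attended  :", round(envelope_correlation(eeg, attended), 3))
print("unattended:", round(envelope_correlation(eeg, unattended), 3))
```

The point of the toy example is simply that the attended speech stream correlates more strongly with the EEG signal than the ignored one, which is the kind of distinction the study below relied on.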

The Research

The study involved 22 participants with an average age of 67. All were experienced hearing aid users with mild to moderate hearing loss.

The participants were fitted with Oticon Opn S 1 miniRITE hearing aids, and the EEG electrodes were attached to their scalps. They were seated in a listening booth and instructed to pay attention to one of two speakers in the room. In addition to the target speaker they were told to focus on, four loudspeakers presented a variety of noise.

Each test began with voices coming from the two target speakers. Five seconds later, a series of news clips was played from the other loudspeakers. Following the test, the participants were asked a question about what the voice from the speaker they were instructed to attend to had been saying. This protocol mimics a real-world conversation in a noisy environment, such as TASTE!, with one primary speaker to focus on, one speaker to ignore and background noise that needs to be suppressed.

The Results

The researchers found a clear distinction in attention tracking between the attended speech and the unattended speech, demonstrating how the brain is able to organize sounds based on relevance.

Based on the strength of the EEG signals, the researchers determined that when the OpenSound Navigator was switched on, users saw: