The young man, a military veteran, had a history of significant noise exposure and limited use of hearing protection.
“He and his wife were at their wits’ end with his difficulty understanding speech in restaurants and at family parties—but his audiograms and ENT exams were consistently normal,” recalled Kaitlin Anderson, AuD, a clinical audiologist at Northwestern Medicine Delnor Hospital in Geneva, IL.
When Anderson subsequently conducted a speech-in-noise test, the results provided some much-needed validation of the patient’s troubles and frustrations. He had a mild bilateral signal-to-noise ratio loss. Or, as it is more commonly known, hidden hearing loss.
The term “hidden hearing loss” is not without its complications. On the surface, it presents as a person who has trouble understanding speech in noisy situations but whose audiometric thresholds are normal.
From there, the complexity grows.
“Hidden hearing loss has two meanings, which is confusing,” said David Moore, PhD, the director of the Communication Sciences Research Center at Cincinnati Children’s Hospital Medical Center in Ohio.
“A very broad meaning is every aspect of hearing impairment that is not captured by the current clinical examination. Some prominent examples include subclinical hearing loss (e.g., audiogram pure tone averages [PTAs] between 10 and 20 dB HL and extended high frequency [EHF] hearing loss above 8 kHz) and impaired speech-in-noise hearing. A much narrower, specific meaning is reduced functional connection between the ear and the brain, a condition called synaptopathy.”
Moore prefers the first meaning because of its inclusivity. A wide interpretation of hidden hearing loss encompasses not only synaptopathy and other problems with an established functional effect on hearing, but also central auditory disorders. (Indeed, a perspective column in The ASHA Leader two years ago asserted that hidden hearing loss is likely most often due to central auditory processing disorders, which, according to the authors, are caused by deficits in the central nervous system’s processing of auditory stimuli.1 For his part, however, Moore has come to believe assumptions that automatically link auditory processing disorders to the central auditory nervous system are unfounded.2 But that’s another discussion.)
Two recent studies by Moore and colleagues reflect the importance of the broader meaning of hidden hearing loss, Moore explained. One showed that elementary school-age children with audiogram PTAs between 10 and 15 dB HL were slipping behind peers with better hearing in important skills, such as speech-in-noise hearing and reading.3 The other linked difficulty hearing in noisy environments with EHF hearing loss in young adults with otherwise normal hearing.4 “The data suggest,” researchers wrote, “that EHF hearing is a long-sought missing link between audiometry and speech perception.”
Now the narrower view. Cochlear synaptopathy is a particular form of hidden hearing loss that reflects damaged auditory neurons rather than the damaged hair cells typical of hearing loss. In cases of synaptopathy, input to the brain is hampered.
“It’s a condition where you lose neurons that are very important for the coding of sound (including speech) in noisy environments,” said Stéphane Maison, PhD, CCC-A, who studies cochlear synaptopathy at Massachusetts Eye and Ear (Mass. Eye and Ear) and Harvard Medical School in Boston. “In short, you can start to lose intelligibility before losing sensitivity (‘I can hear, but I don’t understand.’).”
Maison was the lead investigator of a study that generated plenty of press in 2016 for being the first to present evidence that cochlear synaptopathy occurs in humans (previously, evidence was limited to studies involving animals). Via electrocochleography, the researchers showed a decreased auditory nerve response to sound stimulation in a group of college students who were regularly exposed to loud music but tested normal on standard hearing tests.5
Still, “the best evidence that it does affect humans comes from histopathological studies of human temporal bones,” Maison noted. Highly variable human biological responses, he explained, make detecting synaptopathy difficult.
“We’re working very hard to come up with a diagnosis in humans,” Maison said.
Earlier this year, Maison’s colleagues at Mass. Eye and Ear/Harvard Medical School published evidence suggesting a pair of objective biomarkers of brain function can flag hidden hearing loss. Their investigation used electroencephalography (EEG) to measure electrical signals at the surface of the ear canal to gauge the brain’s processing of the earliest stages of sound.6 In addition, they evaluated pupil diameter changes—a measure of cognitive effort—as participants listened to someone speak amid babbling in the background.
By looking at measures of ear canal EEG and changes in pupil diameter, researchers were able to identify which among nearly two dozen participants struggled to follow speech amid background noise and which did not.
“Combined, these tests can account for nearly 80 percent of the variability in multitalker speech intelligibility, a significant improvement over current clinical tests of hearing, which do not detect these changes at all,” said lead study author and investigator Aravindakshan Parthasarathy, PhD. “These tests could be implemented in most hospital hearing clinics, providing patients with an objective measure of their perceptual difficulties and giving hearing health providers an objective readout for their therapeutic drugs or devices.”
Researchers are focusing on translating the tests for use in hearing clinics. Beyond their apparent ability to detect hidden hearing loss in living humans, the biomarkers identified in the study have a further advantage: they require neither specialized equipment nor marathon-length measurement sessions.
For the 10 percent or so of visitors at the Mass. Eye and Ear hearing clinic who appear to fit the profile of hidden hearing loss, the key markers could offer clarity.
“Our tests show promise in being part of the next-generation clinical battery targeting the silent majority—individuals who struggle to follow conversations in noisy social environments but are not identified by any existing clinical tests,” said Parthasarathy.
For the time being, audiologists continue to face patients struggling with hidden hearing loss, armed with only limited diagnostic tools.
“In our facility, clients typically come in after searching and assessment elsewhere,” said Kathleen Kelliher Ward, AuD, CCC-A, the audiology service coordinator at Loyola Clinical Centers, part of Loyola University Maryland in Baltimore. The clinic specializes in assessing children who have difficulty learning as well as adults with head injuries and strokes.
When she sees their audiogram results, Ward is careful not to remark that their hearing is fine.
“When a person is told their hearing is fine,” she said, “it implies they are not believed by the health care professionals assessing them.”
Instead, she gets to work conducting a comprehensive hearing evaluation. The assessment includes otoscopy, tympanometry, a pure-tone test of the octave band frequencies of speech, speech reception threshold, a word recognition in quiet test, and, finally, a Words-in-Noise (WIN) test. Ward prefers the WIN test because it classifies results in terms of functional ability: within normal limits, mild impairment, moderate impairment, severe impairment, and profound impairment. The whole process takes less than an hour.
“We present the information to the patient, comparing their function in quiet to the scores in noise,” she said. “Functional ability levels seem to be more patient-friendly than just providing percentage scores.”
Anderson, too, begins with a comprehensive evaluation that includes otoscopy, tympanometry, acoustic reflexes (ipsilateral and contralateral), diagnostic distortion product otoacoustic emissions, and air/bone/speech reception threshold/word recognition score testing (with extended high-frequency audiometry if possible), followed by the Quick Speech-in-Noise (QuickSIN) test.
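For readers unfamiliar with how a speech-in-noise result becomes a diagnosis, the sketch below illustrates the published QuickSIN scoring convention: SNR loss is 25.5 minus the number of key words repeated correctly across a list (six sentences, five key words each), with results interpreted in dB bands. This is only an illustration of the arithmetic; clinics follow the test developer’s scoring materials, not this script.

```python
# Illustrative sketch of QuickSIN scoring (SNR loss = 25.5 - key words correct).
# Not a clinical tool; cutoffs reflect the commonly published interpretation bands.

def quicksin_snr_loss(words_correct: int) -> float:
    """SNR loss for one QuickSIN list: 25.5 minus total key words
    repeated correctly (6 sentences x 5 key words = 30 maximum)."""
    if not 0 <= words_correct <= 30:
        raise ValueError("A QuickSIN list is scored 0-30 key words")
    return 25.5 - words_correct

def interpret(snr_loss: float) -> str:
    """Map SNR loss (dB) to the commonly published interpretation bands."""
    if snr_loss <= 3:
        return "normal/near-normal"
    if snr_loss <= 7:
        return "mild SNR loss"
    if snr_loss <= 15:
        return "moderate SNR loss"
    return "severe SNR loss"

loss = quicksin_snr_loss(20)   # e.g., 20 of 30 key words correct
print(loss, interpret(loss))   # 5.5 dB falls in the mild band
```

A patient like the veteran described above could have normal pure-tone thresholds yet land in the mild band here, which is exactly the pattern the article calls hidden hearing loss.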
She may recommend an auditory brainstem response (ABR) test (a decreased wave I amplitude would signal a decrease in auditory nerve fibers suggestive of synaptopathy, while auditory neuropathy may appear in both waves I and V). But Anderson pursues the ABR test only if the patient is willing.
“If the patient arrives and he or she is completely debilitated by difficulty discerning speech in noise, our ENT’s exam is normal, my audiogram is normal, but my QuickSIN results are very abnormal, I am more inclined to counsel on communication strategies and recommend a trial with amplification immediately,” she said. “These people have difficulty hearing in background noise regardless of whether or not they have true cochlear synaptopathy.”
Along with the current lack of standard tests to diagnose the exact etiology, hidden hearing loss has no definitive treatment. Instead, audiologists individualize recommendations based on the patient’s difficulty and life situation.
First comes confirmation. As was the case with the young veteran and his wife, verification that the patient does not have perfect hearing can ease built-up tension and set the stage for moving forward.
“In this scenario, simply explaining the results to him and his wife, counseling on communication strategies and amplification options, as well as validating their concerns, resulted in a huge sigh of relief,” Anderson said.
“It is important to affirm for patients that the difficulty is real and to provide interventions—accommodations, strategies, and, when needed, technology—to manage the difficulties they are experiencing,” agreed Ward.
Communication strategies for hidden hearing loss are similar to those for hearing impairment: face the speaker, stay close, make use of facial features and hand gestures, and employ active listening. Whenever possible, choose environments with carpeting and soft surfaces over those with hardwood floors or similar surfaces, where sound reverberates. When dining out, ask for an out-of-the-way table or, better yet, a padded booth, and face away from the kitchen or other noisy areas. And in academic settings, sit up front, close to the speaker, and avoid vents, fans, and other noise-making equipment.
In some cases, amplification can help. That might mean hearing aids, or—especially in academic, lecture-type settings—it might mean a remote microphone worn by the speaker and a receiver integrated into headphones or hearing aids for the listener. Workers in settings with a lot of background bustle sometimes benefit from caption telephones, which provide real-time transcription of phone conversations.
“Families, teachers, caregivers, coaches, and friends need to be made aware of the difficulty,” Ward advised. “These communication partners can then decrease background noise when communicating, or at least understand the potential for misunderstanding related to the disability associated with hidden hearing loss.”
Patients may need a little encouragement to speak up about their condition, especially since many have lived with it for some time before finally receiving confirmation that their difficulty is real.
“Hearing loss of any kind can be very isolating, and tends to shut us out from the world around us,” said Anderson. “Our ability to communicate effectively with our friends and family and interact with our environment is what keeps us engaged in life. I think that, in itself, is reason enough to pursue treatment.”
3. Ear Hear. 2019 Oct 16. doi:10.1097/AUD.0000000000000802. [Epub ahead of print].
4. Proc Natl Acad Sci U S A. 2019;116(47):23753-23759. doi:10.1073/pnas.1903315116.
5. PLoS One. 2016;11(9):e0162726.
6. eLife. 2020;9:e51419.