Biologically, our human ears are made to work together: to identify sounds, suppress interfering sounds, prioritise vital sounds and locate a sound's source. This coordination happens via a neural reflex that links the cochlea of each ear through the brain's auditory control centre, balancing hearing between the two ears for better sound discrimination.
Pairing Hearing Devices To Process Together
Paired digital hearing aids already work in tandem, correlating and adjusting the sounds detected by their tiny microphones. Now, research in Canada is bringing this binaural (both-ear hearing) capability to cochlear implants. The goal is for paired implants to process sound together, helping wearers make sense of their local soundscape, particularly in noisy settings with multiple competing sounds.
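To make the idea of two devices "correlating" sound concrete, here is a minimal sketch of one ingredient of binaural processing: estimating the interaural time difference (ITD) between the two ears' signals with cross-correlation. This is an illustrative toy, assuming simple sampled audio arrays; it is not the algorithm the Canadian research actually uses.

```python
# Toy ITD estimator: an illustrative sketch, not a real implant algorithm.
import numpy as np

def estimate_itd_samples(left: np.ndarray, right: np.ndarray) -> int:
    """Lag (in samples) at which the two channels line up best.
    A positive value means the left ear heard the sound later,
    i.e. the source sits on the listener's right side."""
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)
```

The brain performs a far more refined version of this comparison; the point is simply that once both processors share their microphone signals, a time-of-arrival cue to a sound's direction becomes computable.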
According to researcher Matthias Dietz, this capability will arrive within the next five years:
"The benefits in having the signal processing on the one device knowing what the signal processing on the other [cochlear implant] device is doing – it's not rocket science. It's going to happen."
Hearing-aid wearers already benefit from directional microphones that tune into the person they are talking to, with ambient noise 'contained' as needed. Like a satellite dish swinging around to track a signal, these microphones beam toward a conversational partner as they speak, and sound is shared wirelessly between the two ears, mimicking the brain's natural sound-processing function.
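The directional "beaming" described above is commonly built on delay-and-sum beamforming. Below is a hypothetical two-microphone sketch of that idea; the sample rate, microphone spacing and function names are all assumptions for illustration, not any manufacturer's actual hearing-aid algorithm.

```python
# Toy delay-and-sum beamformer: a hypothetical sketch of the technique,
# not a real hearing-aid implementation. Parameter values are assumed.
import numpy as np

FS = 16_000             # sample rate in Hz (assumed)
MIC_SPACING = 0.015     # 15 mm between the two microphones (assumed)
SPEED_OF_SOUND = 343.0  # m/s in air

def steer_delay_samples(angle_deg: float) -> int:
    """Extra travel time (in samples) from the front mic to the rear mic
    for a source at angle_deg (0 = straight ahead, 90 = fully to one side)."""
    delay_s = MIC_SPACING * np.sin(np.radians(angle_deg)) / SPEED_OF_SOUND
    return int(round(delay_s * FS))

def delay_and_sum(front: np.ndarray, rear: np.ndarray, angle_deg: float) -> np.ndarray:
    """Steer the microphone pair toward angle_deg: advance the rear channel
    so the target's wavefront lines up, then average the two channels.
    Sound from the chosen direction adds coherently; off-axis sound
    partially cancels, which 'contains' the ambient noise."""
    d = steer_delay_samples(angle_deg)
    aligned = np.roll(rear, -d)  # undo the arrival delay for this direction
    return 0.5 * (front + aligned)
```

Swinging the beam toward a new talker is then just a matter of recomputing the steering delay for the new angle, which is why the devices can follow a conversation around a room.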