U bioengineers discovered that the brain relies on vision more than hearing. In other words, the brain alters what people hear based on what they see.
The researchers conducted this study to better understand how the brain perceives everyday language. They recorded auditory signals in the brain and compared what people said they heard with what they actually heard. The results showed that vision influences the hearing regions of the brain, changing people's perception of reality.
“Additionally, the researchers succeeded in using this new understanding to improve control of a brain-computer interface that helps blind or deaf people see and hear,” said Elliot Smith, an author of the study and a bioengineering and neuroscience graduate student.
The researchers conducted the study on epilepsy patients who needed part of their brain removed to treat their epilepsy. Neurologists recorded from electrodes implanted inside the patients’ skulls to determine which part of the brain to remove. The researchers also recorded from these electrodes while the patients watched videos of a person speaking four syllables: “ba,” “ga,” “va” and “tha.” Each patient then reported what he or she heard.
“Sometimes we would switch the audio and the video and the patient would perceive an illusion in which the patient hears something that is different from what is coming out of the speaker during the video,” Smith said.
This research tried to explain the origin of the McGurk effect, an illusion that reveals the relationship among vision, hearing and the brain. The brain tries to integrate vision and hearing, but visual cues dominate sound cues when the two conflict slightly. In short, the brain creates a new sound when vision and hearing do not match. For example, when a person watches someone saying “ga” but the actual sound is “ba,” the listener hears “da,” a sound intermediate between “ga” and “ba.” Although the McGurk effect has been observed for decades, its origin has remained elusive.
“We’ve shown neural signals in the brain that should be driven by sound are being overridden by visual cues that say, ‘Hear this!’” said Bradley Greger, who worked on the study and has since moved to Arizona State University. “Your brain is essentially ignoring the physics of sound in the ear and following what’s happening through your vision.”
In addition, the new findings helped the researchers understand how humans process language, especially how infants’ brains develop as they learn language by connecting sounds with lip movements.
The findings may also help researchers sort out how language processing goes wrong when visual and auditory inputs are not integrated correctly, such as in dyslexia, Greger said in the release.