Former Ph.D. student Cecilie Møller, from the Center for Music in the Brain (MIB) and the Department of Psychology (Psykologisk Institut) at Aarhus University, can now call herself a post-doc after successfully defending her dissertation on the interplay between hearing and sight.
The world is a multimodal place, and our perception of it is shaped by several sensory systems working at once. Because the senses are always active simultaneously, impressions we attribute to a single sense are often strongly influenced by information from the others. When we listen to someone speaking, for example, we also use our sight to understand what is being said. At the basic research center MIB, post-doc Cecilie Møller recently defended her Ph.D. project, in which she examined the interplay between the sense of sight and the sense of hearing, with a focus on our ability to distinguish small differences in pitch.
“Sounds, including music, consist of different elements, among them differences in pitch. Pitch is what melodies are built from. If, for example, you have to tune a guitar by ear, or judge whether somebody is singing off-key, you must be good at differentiating between nuances in pitch,” said Møller.
Traditional perception research examines one sense at a time – so-called “uni-sensory perception” – a practice that was a necessary first step toward valuable knowledge about the function of sensory receptors and nerves. But according to Møller, a description of perceptual processes should also account for the interplay between hearing and sight.
“Our senses interact and cooperate to give us a precise conception of the world around us. Smell, taste and touch are characterized by the fact that the object giving rise to the sensory experience must be close to us. Sight and hearing stand apart because they also register what happens farther away. Most of our sensory experiences originate from objects at a distance, and hearing and sight therefore work as especially compatible partners. A large body of research has shown that most of us associate a high tone with an object placed high rather than low in the visual field,” Møller explained.
In her project, Møller first examined the advantages of multisensory over uni-sensory processing, and then studied individual differences in how visual information improves the human ability to detect small changes in pitch.
Her results stem from experiments with 49 test subjects, one-third of whom were professional musicians with particularly well-developed hearing. One experiment tested each individual’s ability to differentiate between pitches. The participants sat in front of a computer wearing headphones and were asked to press a key whenever they registered a difference in pitch within a series of otherwise identical tones. Alongside this auditory task, the experiment also had a visual dimension: a circle shown on the computer screen. Sometimes the circle changed its position on the screen at the same time as the pitch changed.
The results showed that participants without well-developed hearing benefited more from the visual information than those with well-developed musical hearing. The advantage of multisensory over uni-sensory processing appeared as improved sensitivity – an increased ability to hear the pitch differences – when a pitch change coincided with a change in the circle’s position on the screen, as opposed to when the position stayed the same.
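Sensitivity gains of this kind are commonly quantified with signal detection theory’s d′, which separates a listener’s true ability to detect a change from mere willingness to respond. The article does not specify the measure Møller used, so the sketch below is purely illustrative, with made-up hit and false-alarm rates:

```python
# Illustrative sketch (not the study's actual analysis or data): computing
# sensitivity d' = z(hit rate) - z(false-alarm rate) for a hypothetical
# listener in audio-only vs. audiovisual conditions.
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Higher d' means better sensitivity to the pitch change."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical response rates for one listener:
audio_only = d_prime(hit_rate=0.70, false_alarm_rate=0.20)   # tones alone
audiovisual = d_prime(hit_rate=0.85, false_alarm_rate=0.15)  # circle moves with the tone

print(f"audio-only d'  = {audio_only:.2f}")   # ~1.37
print(f"audiovisual d' = {audiovisual:.2f}")  # ~2.07
```

With these (invented) numbers, the audiovisual condition yields the higher d′, mirroring the pattern the article describes: the coinciding visual change makes the same pitch difference easier to detect.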
Another experiment applied magnetoencephalography (MEG) – an imaging method for mapping brain activity – and showed the same pattern at an early processing stage in the brain’s primary auditory cortex. In this way, Møller not only confirmed previous studies showing that sight contributes to our perception of sound, but also demonstrated, at the neural level, the connection between an individual test subject’s auditory sensitivity and the degree to which the visual information influenced them.
“We found that the participants who were already good at differentiating between small pitch differences under purely auditory conditions benefited less from the visual cues in the audiovisual condition than those who were less good at detecting pitch differences,” Møller explained. She added:
“The most important point of my Ph.D. project is that people differ in how much sight affects their hearing, and that this difference can actually be described and even predicted from knowledge of their individual auditory sensitivity.”
Cecilie Møller is now employed as a post-doc at the Center for Music in the Brain, where she builds on the findings of her Ph.D. project with continued basic research on individual differences in perceptual processes. In the future, she hopes to apply this knowledge in a new project on the connection between children’s musical and linguistic development.