Visual influences on auditory speech perception
Multisensory research has demonstrated that observing lip movements is critical for phoneme acquisition during development, as well as when auditory information is degraded by environmental noise or hearing deficits. This line of research is supported by our current NIH K99/R00 grant. Our lab’s model of multisensory speech perception proposes that visual information facilitates speech processing through multiple distinct anatomical and functional networks that integrate different types of information across the senses: (1) visual contextual information associated with auditory speech signals (e.g., the gender and identity of the speaker), (2) preparatory lip motion that predicts the timing of phonemic onsets, (3) lipreading information, and (4) low-level arousal mechanisms.
Synesthesia is a neurological phenomenon in which individuals are born with additional links between their senses. Some synesthetes experience black numbers and letters as having colors (e.g., a 2 may look blue, or carry the feeling that it should be red); for others, sounds elicit colors or tastes. In principle, synesthesia can link any two senses. These experiences are relatively common (2-4% of the population have one or more forms), and possessing synesthesia is not associated with any disorder.
We are currently conducting research on the relationship between synesthesia and more typical multisensory experiences. If you believe you may have synesthesia and are interested in participating in our research, please contact us at email@example.com.