Why do some people see yellow when they hear a sound? How do we adjust our body image to fit the boundaries of objects we use, such as our cars? How does knowing a speaker's identity and what their voice sounds like help us hear them better at a party?
The Multisensory Perception Lab studies how information from one sensory system influences processing in other sensory systems, and how this information is integrated in the brain. Specifically, we investigate the mechanisms underlying basic auditory, visual, and tactile interactions, synesthesia, multisensory body image perception, and visual facilitation of speech perception. Our current research examines multisensory processes using a variety of techniques, including psychophysical testing and illusions, fMRI and DTI, and electrophysiological measures of neural activity (both EEG and ECoG).
Our electrocorticography (ECoG) recordings in particular are a unique resource, allowing us to record neural activity directly from the human brain via clinically implanted electrodes in patients. These recordings are collected while patients perform the same auditory, visual, and tactile tasks that we use in our other behavioral and neuroimaging studies, but ECoG measures offer millisecond temporal resolution as well as millimeter spatial precision, providing unparalleled information about the flow of neural activity through the brain.
The Multisensory Perception Lab is accepting graduate students to start in Fall 2017. Please contact Dr. David Brang (Lab Director) at firstname.lastname@example.org with any questions.