The way we process sensory information through touch, audition, and vision can be remarkably similar. Crossmodal perceptual correspondences may reflect common neural coding schemes employed by neuron populations in the sensory cortices, and analogous coding mechanisms may also facilitate information transfer between the senses. Additionally, shared neural systems may be recruited to support supramodal processing of tactile, auditory, and visual signals.
One of our lab’s primary research aims is to identify fundamental principles of sensory processing in humans. These principles can inform the development of models of perceptual function that are grounded in neurophysiology. Using psychophysics, functional neuroimaging (fMRI), and noninvasive brain stimulation (TMS and tDCS), our studies address how the sensory modalities operate, individually and cooperatively, in processing spatial (e.g., shape) and temporal (e.g., frequency) information. The lab’s immediate areas of interest include frequency perception, sensory integration, and decision making.