Humans use facial expressions to communicate, and two of the most important types are mouth movements and eye movements. Yet we know little about the brain mechanisms for understanding facial expressions. To address this question, Lin Zhu, a BCM M.D./Ph.D. student working in Dr. Michael Beauchamp’s laboratory, is using the 3-tesla scanners in the Center for Advanced MRI.
In her studies, Lin shows volunteers mouth movements and eye movements while measuring their brain activity with blood oxygen level dependent functional magnetic resonance imaging (BOLD fMRI).
Her studies have shown that different parts of the brain, especially the superior temporal sulcus, respond preferentially to either mouth movements or eye movements. Interestingly, the regions that respond strongly to mouth movements also respond strongly to listening to stories. This suggests a link between the neural structures that process visual information and those that process auditory information, a phenomenon known as multisensory integration. This knowledge could prove important for helping patients who have trouble communicating, such as children with autism spectrum disorders.