Image: A blue drawing on a dark background showing a brain inside the outline of a person's head, with pinpoints of light indicating activity.

Researchers discover advanced language processing in the unconscious human brain

Baylor College of Medicine researchers have found that the human brain is capable of sophisticated language processing while unconscious under general anesthesia. The findings, published in the latest edition of Nature, challenge what we know about the role of consciousness in cognition, and could open new ways of understanding memory, language and brain-computer interfaces.

“Our findings show that the brain is far more active and capable during unconsciousness than previously thought,” said Dr. Sameer Sheth, professor and Cullen Foundation Endowed Chair of neurosurgery and a McNair Scholar at Baylor. “Even when patients are fully anesthetized, their brains continue to analyze the world around them.”

Sheth, who is also a neurosurgeon at Baylor St. Luke’s Medical Center, and his collaborators first recorded neural activity from hundreds of individual neurons in the hippocampus, a part of the brain associated with memory, while patients were under general anesthesia during epilepsy surgery. Patients undergoing this type of surgery were recruited because the procedure gave researchers access to this particular part of the brain.

Using Neuropixels probes, a technology that had not previously been used in this part of the brain, the team collected data on how the brain processed sound and language without conscious awareness.

In the first experiment, patients were played repetitive tones interrupted by an occasional different sound. Researchers found that hippocampal neurons could distinguish these unusual tones from the standard ones, and that this ability improved over time, suggesting a form of learning, or neural plasticity, during anesthesia.

Researchers then conducted a more complex experiment in which they played short stories to patients while recording neural responses. Surprisingly, the hippocampus demonstrated real-time processing of language. Neural activity showed the brain’s ability to differentiate parts of speech, such as nouns, verbs and adjectives, based on patterns of neuron firing.

Even more surprising, researchers found that neural signals could predict upcoming words in a sentence. 

“The brain appears to anticipate what comes next in a story, even without conscious awareness,” said Sheth, who is also director of The Gordon and Mary Cain Pediatric Neurology Research Foundation Laboratories within the Duncan Neurological Research Institute at Texas Children’s Hospital.

“This kind of predictive coding is something we associate with being awake and attentive, yet it’s happening here in an unconscious state,” said Dr. Benjamin Hayden, professor of neurosurgery and a McNair Scholar at Baylor. 

These discoveries suggest that cognitive functions such as language comprehension and prediction do not require consciousness. Instead, consciousness may depend on broader coordination across brain regions rather than activity within a single structure like the hippocampus. 

This activity also mirrors the predictive behavior seen in artificial intelligence (AI). The brain’s ability to predict upcoming words is similar to how large language models generate text. These findings help researchers understand how biological and artificial systems process information. This could be a step towards the development and refinement of new technologies for communication, such as speech prosthetics for individuals who are unable to speak.
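The parallel to language models can be made concrete with a loose illustration. This is not the study's method or an actual language model, just a toy bigram predictor: it guesses each word from the one before it by counting which continuations were most frequent in a training text, a statistical version of "anticipating what comes next."

```python
from collections import defaultdict, Counter

def train_bigram(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequent continuation of `word`, or None if unseen."""
    counts = follows.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Toy training "story" (hypothetical example text)
story = "the dog chased the cat and the cat chased the mouse"
model = train_bigram(story)
print(predict_next(model, "the"))  # prints "cat" (the most common word after "the")
```

Real large language models replace these raw counts with learned representations over long contexts, but the underlying task, predicting the upcoming word, is the same one the hippocampal signals appeared to perform.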

“Can we use these signals to deploy and run a speech prosthetic for some of the parts of the brain that are damaged by stroke or injury? These are questions that we can now consider in relation to this part of the brain,” said Dr. Vigi Katlowitz, first author and a neurosurgery resident with Baylor. 

However, more research is needed. The findings are specific to one type of anesthesia and may not apply to other unconscious states, such as sleep or coma. The study also examined only one brain region, so it is unknown how widespread these processes are across the brain.

“This work pushes us to rethink what it means to be conscious,” said Sheth. “The brain is doing much more behind the scenes than we fully understand.”

Others who contributed to the study include: Eric R. Cole, Elizabeth A. Mickiewicz, Shraddha Shah, Melissa Franch, Joshua A. Adkinson, James L. Belanger, Raissa K. Mathura, Domokos Meszéna, Matthew McGinley, William Muñoz, Garrett P. Banks, Sydney S. Cash, Chih-Wei Hsu, Angelique C. Paulk, Nicole R. Provenza, Andrew J. Watrous, Ziv Williams, Alica M. Goldman, Vaishnav Krishnan, Atul Maheshwari, Sarah R. Heilbronner, Robert Kim and Nuttida Rungratsameetaweemana. See a list of affiliations in the publication.

This project was funded in part by the National Institutes of Health (U01 NS121472), the McNair Foundation and the Gordon and Mary Cain Pediatric Neurology Research Foundation, and was supported by the Optical Imaging & Vital Microscopy Core at Baylor College of Medicine.

