Researchers find neurons work as a team to process social interactions

Researchers have discovered that a part of the brain associated with working memory and multisensory integration may also play an important role in how the brain processes social cues. Previous research has shown that neurons in the ventrolateral prefrontal cortex (VLPFC) integrate faces and voices — but new research, published in the Journal of Neuroscience, shows that neurons in the VLPFC play a role in processing both the identity of the “speaker” and the expression conveyed by facial gestures and vocalizations.

“We still don’t fully understand how facial and vocal information is combined and what information is processed by different brain regions,” said Lizabeth Romanski, PhD, associate professor of Neuroscience at the Del Monte Institute for Neuroscience at the University of Rochester and senior author of the study. “However, these findings confirm VLPFC as a critical node in the social communication network that processes facial expressions, vocalizations, and social cues.”

The VLPFC is an area of the brain that is enlarged in primates, including humans and macaques. In this study, the Romanski Lab showed rhesus macaques short videos of other macaques producing vocalizations and facial expressions that were friendly, aggressive, or neutral. The researchers recorded the activity of more than 400 neurons in the VLPFC and found that, individually, the cells did not exhibit strong categorical responses to the expressions or the identities of the macaques in the videos. However, when the researchers analyzed the neurons as a population, a machine learning model could be trained to decode the expression and identity in the videos based only on the patterns of neural activity, suggesting that the neurons were collectively responding to these variables. Overall, the activity of the population of VLPFC neurons was primarily dictated by the identity of the macaque in the video. These findings suggest that the VLPFC is a key brain region in the processing of social cues.
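To give a sense of the population-decoding logic described above, here is a minimal Python sketch: a linear classifier is trained on a trial-by-neuron matrix of firing rates and evaluated with cross-validation. The neuron count reflects the study (more than 400 neurons), but the trial count, number of identities, the random stand-in data, and the choice of classifier are illustrative assumptions, not the authors' actual methods.

```python
# Hypothetical population-decoding sketch: above-chance cross-validated
# accuracy would indicate that the population collectively encodes a
# variable, even when single neurons show no clear categorical tuning.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_neurons = 400   # population size (the study recorded >400 VLPFC neurons)
n_trials = 240    # assumed number of video-presentation trials

# Pseudo-population response matrix: one row per trial, one column per
# neuron (e.g., a trial-averaged firing rate). Random stand-in data here.
X = rng.poisson(lam=5.0, size=(n_trials, n_neurons)).astype(float)

# Trial labels for the two variables decoded in the study:
# expression (friendly / aggressive / neutral) and speaker identity.
expression = rng.integers(0, 3, size=n_trials)  # 3 expression categories
identity = rng.integers(0, 5, size=n_trials)    # assumed 5 individuals

# Linear decoder trained on population activity patterns.
decoder = LogisticRegression(max_iter=1000)
for name, y in [("expression", expression), ("identity", identity)]:
    acc = cross_val_score(decoder, X, y, cv=5).mean()
    print(f"{name}: cross-validated decoding accuracy = {acc:.2f}")
```

With real recordings in place of the random matrix, comparing decoding accuracy across variables is one way to ask which feature, identity or expression, dominates the population response.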

“We used dynamic, information-rich stimuli in our study and the responses we saw from single neurons were very complex. Initially, it was difficult to make sense of the data,” said Keshov Sharma, PhD, lead author on the study. “It wasn’t until we studied how population activity correlated with the social information in our stimuli that we found a coherent structure. For us, it was like finally seeing a forest instead of a muddle of trees.” Sharma and Romanski hope their approach will encourage others to analyze population-level activity when studying how faces and voices are integrated in the brain.

Dissecting the brain’s audiovisual processing circuits

Understanding how the prefrontal cortex processes auditory and visual information is a cornerstone of the Romanski Lab's work. This process is necessary for recognizing objects by sight as well as by sound, and it is essential for effective communication. In previous research, the Romanski Lab identified the VLPFC as an area of the brain responsible for maintaining and integrating face and vocal information during working memory. This body of research points to the importance of this brain region within the larger circuit that underlies social communication.

“Knowing what features populations of neurons extract from face and vocal stimuli and how these features are typically integrated will help us to understand what may be altered in speech and communication disorders, including autism spectrum disorders, where multiple sensory stimuli may not combine optimally,” Romanski said.

Additional authors include Mark Diltz of the University of Rochester Medical Center, Theodore Lincoln of Astrobotic Technology Inc., and Eric Albuquerque of the University of Miami School of Medicine. This research was supported by the National Institutes of Health, the Schmitt Program for Integrative Neuroscience through the Del Monte Institute for Neuroscience, and the University of Rochester Medical Scientist Training Program (MSTP).