How the brain encodes speech

New research from Northwestern Medicine and the Weinberg College of Arts and Sciences has moved science closer to speech brain-machine interfaces by unlocking new information about how the brain encodes speech. Researchers discovered that the brain controls speech production in a manner similar to how it controls arm and hand movements. They recorded signals from two parts of the brain and decoded what those signals represented. They found that the brain represents both the goals of what we are trying to say (speech sounds) and the individual movements we use to achieve those goals (how we move our lips, palate, tongue and larynx). These two representations occur in different parts of the brain.

The findings could also help people with speech disorders such as apraxia of speech, which is seen in children as well as in adults after stroke. In apraxia of speech, an individual has difficulty translating speech messages from the brain into spoken language. Speech is composed of individual sounds, called phonemes, that are produced by coordinated movements of the lips, tongue, palate and larynx.
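To make the phoneme/gesture distinction concrete, here is a minimal sketch in Python. The gesture labels are invented for illustration; the study's actual articulatory coding scheme is not described here.

```python
# Hypothetical mapping from a few English phonemes to the coordinated
# articulatory gestures that produce them (labels are illustrative only).
PHONEME_GESTURES = {
    "b": {"lips": "closure", "larynx": "voiced"},
    "p": {"lips": "closure", "larynx": "voiceless"},
    "t": {"tongue": "alveolar closure", "larynx": "voiceless"},
    "a": {"tongue": "low", "lips": "open", "larynx": "voiced"},
}

def gestures_for_word(phonemes):
    """Return the sequence of gesture sets needed to articulate the phonemes."""
    return [PHONEME_GESTURES[p] for p in phonemes]

# Saying "bat" means closing the lips, then lowering the tongue, and so on.
print(gestures_for_word(["b", "a", "t"]))
```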

The researchers hypothesized that speech motor areas of the brain would have a similar organization to arm motor areas: the precentral cortex would represent movements (gestures) of the lips, tongue, palate and larynx, while higher-level cortical areas would represent the phonemes to a greater extent.

The precentral cortex did represent gestures to a greater extent than phonemes. The inferior frontal cortex, a higher-level speech area, represented both phonemes and gestures. The researchers recorded brain signals from the cortical surface using electrodes placed in patients undergoing brain surgery to remove tumors. Because the patients had to be awake during their surgery, the researchers asked them to read words from a screen.

After the surgery, scientists marked the times when the patients produced phonemes and gestures. Then they used the recorded brain signals from each cortical area to decode which phonemes and gestures had been produced, and measured the decoding accuracy.
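As a rough illustration of this kind of decoding analysis, here is a hedged Python sketch. Everything in it is an assumption: the features, the decoder (a simple linear classifier from scikit-learn) and the synthetic data standing in for the real cortical recordings; the study's actual methods are not specified here.

```python
# Schematic decoding analysis with synthetic data in place of real recordings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical setup: one feature vector per electrode for each produced
# speech event, labelled with the phoneme (or gesture) that was produced.
n_events, n_electrodes = 200, 64
X = rng.normal(size=(n_events, n_electrodes))       # neural features
phoneme_labels = rng.integers(0, 4, size=n_events)  # e.g. 4 phoneme classes

def decoding_accuracy(X, y):
    """Cross-validated accuracy of a linear decoder: a simple measure of
    how well signals from a cortical area represent the given labels."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, y, cv=5).mean()

print(f"phoneme decoding accuracy: {decoding_accuracy(X, phoneme_labels):.2f}")
# Repeating this per cortical area (precentral vs. inferior frontal) and per
# label type (phonemes vs. gestures) yields the comparison described below.
```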

The brain signals in the precentral cortex decoded gestures more accurately than phonemes, while those in the inferior frontal cortex decoded both equally well. This information supports linguistic models of speech production. It will also guide engineers in designing brain-machine interfaces that decode speech from these brain areas.

haleplushearty.org