These findings could help people with other speech disorders, such as apraxia of speech, which is seen in children as well as in adults after a stroke. In apraxia of speech, an individual has difficulty translating speech messages from the brain into spoken language. Speech is composed of individual sounds, called phonemes, that are produced by coordinated movements of the lips, tongue, palate and larynx.
Scientists predicted that speech motor areas of the brain would be organized much like arm motor areas: the precentral cortex would represent movements (gestures) of the lips, tongue, palate and larynx, while higher-level cortical areas would represent the phonemes to a greater extent.
Researchers found that the precentral cortex represented gestures to a greater extent than phonemes, while the inferior frontal cortex, a higher-level speech area, represented both phonemes and gestures. To study this, they recorded brain signals from the cortical surface using electrodes placed in patients undergoing surgery to remove brain tumors. Because the patients had to be awake during their surgery, the researchers could ask them to read words aloud from a screen.
After the surgery, the scientists marked the times at which the patients had produced each phoneme and gesture. They then used the recorded brain signals from each cortical area to decode which phonemes and gestures had been produced, and measured the decoding accuracy.
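The decoding step described above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not the authors' actual analysis): it builds synthetic "trials" of neural features with class-specific means, trains a simple nearest-centroid classifier to stand in for the decoders typically used in such studies, and reports accuracy on held-out trials.

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_accuracy(features, labels, train_frac=0.8):
    """Nearest-centroid decoding accuracy on a held-out test split.

    A deliberately simple stand-in for the classifiers used to decode
    phonemes or gestures from cortical recordings.
    """
    n = len(labels)
    idx = rng.permutation(n)
    split = int(train_frac * n)
    train, test = idx[:split], idx[split:]
    classes = np.unique(labels)
    # Mean feature vector (centroid) per class, fit on training trials only
    centroids = np.array([features[train][labels[train] == c].mean(axis=0)
                          for c in classes])
    # Assign each test trial to the class with the nearest centroid
    dists = np.linalg.norm(features[test][:, None, :] - centroids[None], axis=2)
    predicted = classes[np.argmin(dists, axis=1)]
    return float(np.mean(predicted == labels[test]))

# Synthetic data: 4 classes (e.g. four phonemes or gestures), 40-dimensional
# feature vectors mimicking epochs of activity from a cortical area
n_trials, n_features, n_classes = 400, 40, 4
labels = rng.integers(0, n_classes, n_trials)
class_means = rng.normal(0, 2.0, (n_classes, n_features))
features = class_means[labels] + rng.normal(0, 1.0, (n_trials, n_features))

acc = decode_accuracy(features, labels)
print(f"decoding accuracy: {acc:.2f}")  # well above chance (0.25 for 4 classes)
```

Comparing this accuracy across cortical areas, and across phoneme versus gesture labels, is the kind of contrast the study describes: an area whose signals decode one label type better is taken to represent that type more strongly.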
Signals from the precentral cortex decoded gestures more accurately than phonemes, while signals from the inferior frontal cortex decoded phonemes and gestures equally well. These findings support linguistic models of speech production. They will also guide engineers in designing brain-machine interfaces to decode speech from these brain areas.