The exchange of words in conversation, speaking and listening, may seem unremarkable to most people, but communicating with others is a challenge for people who have aphasia, an impairment of language that often follows a stroke or other brain injury. Aphasia affects about 1 in 250 people, making it more common than Parkinson’s disease or cerebral palsy, and it can make it difficult to return to work and to maintain social relationships. A new study published in the journal Nature Communications provides a detailed brain map of language impairments in aphasia following stroke.
“By studying language in people with aphasia, we can try to accomplish two goals at once: we can improve our clinical understanding of aphasia and get new insights into how language is organized in the mind and brain,” said Daniel Mirman, PhD, an assistant professor in Drexel University’s College of Arts and Sciences who was lead author of the study.
The study is part of a larger multi-site research project funded by grants from the National Institutes of Health and led by senior author Myrna Schwartz, PhD, of the Moss Rehabilitation Research Institute. The researchers examined data from 99 people who had persistent language impairments after a left-hemisphere stroke. In the first part of the study, the researchers collected 17 measures of cognitive and language performance and used a statistical technique to find the common elements that underlie performance on multiple measures.
They found that spoken language impairments vary along four dimensions or factors:
- Semantic Recognition: difficulty recognizing the meaning or relationship of concepts, such as matching related pictures or matching words to associated pictures.
- Speech Recognition: difficulty with fine-grained speech perception, such as telling “ba” and “da” apart or determining whether two words rhyme.
- Speech Production: difficulty planning and executing speech actions, such as repeating real and made-up words or the tendency to make speech errors like saying “girappe” for “giraffe.”
- Semantic Errors: making semantic speech errors, such as saying “zebra” instead of “giraffe,” regardless of performance on other tasks that involved processing meaning.
Mapping the Four Factors in the Brain
Next, the researchers determined how individual performance differences for each of these factors were associated with the locations in the brain damaged by stroke. This procedure created a four-factor lesion-symptom map of hotspots in the language-specialized left hemisphere where damage from a stroke tended to cause deficits for each specific type of language impairment. One key area was the left Sylvian fissure: speech production and speech recognition were organized as a kind of two-lane, two-way highway around the Sylvian fissure. Damage above the Sylvian fissure, in the parietal and frontal lobes, tended to cause speech production deficits; damage below the Sylvian fissure, in the temporal lobe, tended to cause speech recognition deficits. These results provide new evidence that the cortex around the Sylvian fissure houses separable neural specializations for speech recognition and production.
Semantic errors were most strongly associated with lesions in the left anterior temporal lobe, a location consistent with previous research findings from these researchers and several other research groups. This finding also made an important comparison point for its opposite factor — semantic recognition, which many researchers have argued critically depends on the anterior temporal lobes. Instead, Mirman and colleagues found that semantic recognition deficits were associated with damage to an area they call a “white matter bottleneck” — a region of convergence between multiple tracts of white matter that connect brain regions required for knowing the meanings of words, objects, actions and events.
“Semantic memory almost certainly involves a widely distributed neural system because meaning involves so many different kinds of information,” said Mirman. “We think the white matter bottleneck looks important because it is a point of convergence among multiple pathways in the brain, making this area a vulnerable spot where a small amount of damage can have large functional consequences for semantic processing.”
In a follow-up article soon to be published in the journal Neuropsychologia, Mirman, Schwartz and their colleagues also confirmed these findings with a re-analysis using a new and more sophisticated statistical technique for lesion-symptom mapping.
These studies provide a new perspective on diagnosing different kinds of aphasia, which can have a big impact on how clinicians think about the condition and how they approach developing treatment strategies. The research team at the Moss Rehabilitation Research Institute works closely with its clinical affiliate, the MossRehab Aphasia Center, to develop and test approaches to aphasia rehabilitation that meet the individualized, long-term goals of the patients and are informed by scientific evidence.
According to Schwartz, “A major challenge facing speech-language therapists is the wide diversity of symptoms that one sees in stroke aphasia. With this study, we took a major step towards explaining the symptom diversity in relation to a few primary underlying processes and their mosaic-like representation in the brain. These can serve as targets for new diagnostic assessments and treatment interventions.”
Studying the association between patterns of brain injury and cognitive deficits is a classic approach, with roots in 19th-century neurology at the dawn of cognitive neuroscience. Mirman, Schwartz and their colleagues have scaled up this approach, both in the number of participants and in the number of performance measures, and combined it with 21st-century brain imaging and statistical techniques. A single study may not be able to fully reveal a system as complex as language and the brain, but the more we learn, the closer we get to translating basic cognitive neuroscience into effective rehabilitation strategies.
This story originally appeared at Drexel University.
- Daniel Mirman, Qi Chen, Yongsheng Zhang, Ze Wang, Olufunsho K. Faseyitan, H. Branch Coslett, Myrna F. Schwartz. Neural organization of spoken language revealed by lesion–symptom mapping. Nature Communications, 2015; 6: 6762 DOI: 10.1038/ncomms7762