Invited talk "Contributions of auditory and motor cortices to speech comprehension"

by Anne Keitel,

Centre of Cognitive Neuroimaging (CCNi),

University of Glasgow


will take place on Friday, 6 April 2018 from 11:00 to 12:00 hours in CBBM, Ground Floor, B1/B2.

Host: Prof. Jonas Obleser
Institute of Psychology I
University of Lübeck


How the human brain makes sense of a continuous speech stream is of interest to neuroscience, linguistics, and research on language disorders. Previous work examining dynamic brain activity has addressed comprehension only indirectly, for example by contrasting intelligible with unintelligible speech. Recent work, however, suggests that brain areas can show similar stimulus-driven activity yet contribute differently to perception or comprehension. In this talk, I will focus on our recent study (Keitel, Gross & Kayser, 2018, PLoS Biology), which directly addressed the perceptual relevance of dynamic brain activity for speech encoding by using a straightforward, single-trial comprehension measure. Furthermore, previous work has often been vague about the time scales analysed; we therefore based our analysis directly on the time scales of the phrases, words, syllables, and phonemes in our speech stimuli. With these two conceptual innovations, we demonstrate that distinct brain areas track acoustic information at the time scales of words and phrases. Moreover, our results suggest that the motor cortex uses a cross-frequency coupling mechanism to predict the timing of phrases in ongoing speech. I will also present further findings on the motor system's multifaceted involvement in speech processing. In sum, our recent findings point to spatially and temporally distinct brain mechanisms that directly shape comprehension.

Research Summary

I am interested in intrinsic rhythmic activity in the human brain and how it might help us understand natural spoken language. I recently showed that each brain area has its own characteristic mix of intrinsic rhythms (Keitel & Gross, 2016, PLoS Biology). Speech, in turn, is itself inherently (quasi-)rhythmic, with distinct rhythms for phonemes, syllables, words, and phrases. How intrinsic brain rhythms capitalise on these speech rhythms to support comprehension is therefore a highly interesting question that can teach us more about mechanistic brain function. I use magnetoencephalography (MEG) and electroencephalography (EEG), together with metrics such as mutual information and phase coherence.
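To illustrate the two metrics mentioned above, here is a minimal sketch, not the authors' actual pipeline: it estimates mutual information between a stimulus envelope and a neural signal via binned histograms, and phase coherence as a phase-locking value via the Hilbert transform. The synthetic signals, sampling rate, and bin count are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs = 200                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 s of data

# Synthetic "speech envelope" at a syllable-like rate (~4 Hz) and a
# "neural" signal that partly tracks it, plus noise (both hypothetical).
envelope = np.sin(2 * np.pi * 4 * t)
neural = 0.7 * envelope + 0.5 * rng.standard_normal(t.size)

def mutual_information(x, y, bins=16):
    """Histogram-based mutual information estimate in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                        # joint probability
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def phase_locking_value(x, y):
    """Consistency of the phase difference between two signals (0 to 1)."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

print(mutual_information(envelope, neural))   # > 0 bits: shared information
print(phase_locking_value(envelope, neural))  # higher when phase-locked
```

In actual MEG/EEG analyses these quantities would be computed per frequency band and source location; this sketch only shows the core computations.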