Shared and modality-specific brain regions that mediate auditory and visual word comprehension
Visual speech carried by lip movements is an integral part of communication. Yet, it remains
unclear to what extent visual and acoustic speech comprehension are mediated by the same …
MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading
Speech is an intrinsically multisensory signal, and seeing the speaker's lips forms a
cornerstone of communication in acoustically impoverished environments. Still, it remains …
A representation of abstract linguistic categories in the visual system underlies successful lipreading
There is considerable debate over how visual speech is processed in the absence of sound
and whether neural activity supporting lipreading occurs in visual brain areas. Much of the …
Cortical tracking of continuous speech under bimodal divided attention
Speech processing often occurs amid competing inputs from other modalities, for example,
listening to the radio while driving. We examined the extent to which dividing attention …
Bihemispheric foundations for human speech comprehension
Emerging evidence from neuroimaging and neuropsychology suggests that human speech
comprehension engages two types of neurocognitive processes: a distributed bilateral …
A linguistic representation in the visual system underlies successful lipreading
There is considerable debate over how visual speech is processed in the absence of sound
and whether neural activity supporting lipreading occurs in visual brain areas. Surprisingly …
Two cortical mechanisms support the integration of visual and auditory speech: A hypothesis and preliminary data
K Okada, G Hickok - Neuroscience letters, 2009 - Elsevier
Visual speech (lip-reading) influences the perception of heard speech. The literature
suggests at least two possible mechanisms for this influence: “direct” sensory–sensory …
Differential auditory and visual phase-locking are observed during audio-visual benefit and silent lip-reading for speech perception
Speech perception in noisy environments is enhanced by seeing facial movements of
communication partners. However, the neural mechanisms by which audio and visual …
Decoding the cortical dynamics of sound-meaning mapping
Comprehending speech involves the rapid and optimally efficient mapping from sound to
meaning. Influential cognitive models of spoken word recognition propose that the onset of a …
Inside speech: Multisensory and modality-specific processing of tongue and lip speech actions
Action recognition has been found to rely not only on sensory brain areas but also partly on
the observer's motor system. However, whether distinct auditory and visual experiences of …