Shared and modality-specific brain regions that mediate auditory and visual word comprehension

A Keitel, J Gross, C Kayser - Elife, 2020 - elifesciences.org
Visual speech carried by lip movements is an integral part of communication. Yet, it remains
unclear to what extent visual and acoustic speech comprehension are mediated by the same …

MEG activity in visual and auditory cortices represents acoustic speech-related information during silent lip reading

F Bröhl, A Keitel, C Kayser - eneuro, 2022 - eneuro.org
Speech is an intrinsically multisensory signal, and seeing the speaker's lips forms a
cornerstone of communication in acoustically impoverished environments. Still, it remains …

A representation of abstract linguistic categories in the visual system underlies successful lipreading

AR Nidiffer, CZ Cao, A O'Sullivan, EC Lalor - NeuroImage, 2023 - Elsevier
There is considerable debate over how visual speech is processed in the absence of sound
and whether neural activity supporting lipreading occurs in visual brain areas. Much of the …

Cortical tracking of continuous speech under bimodal divided attention

Z Xie, C Brodbeck, B Chandrasekaran - Neurobiology of Language, 2023 - direct.mit.edu
Speech processing often occurs amid competing inputs from other modalities, for example,
listening to the radio while driving. We examined the extent to which dividing attention …

Bihemispheric foundations for human speech comprehension

M Bozic, LK Tyler, DT Ives, B Randall… - Proceedings of the …, 2010 - National Acad Sciences
Emerging evidence from neuroimaging and neuropsychology suggests that human speech
comprehension engages two types of neurocognitive processes: a distributed bilateral …

A linguistic representation in the visual system underlies successful lipreading

AR Nidiffer, CZ Cao, A O'Sullivan, EC Lalor - bioRxiv, 2021 - biorxiv.org
There is considerable debate over how visual speech is processed in the absence of sound
and whether neural activity supporting lipreading occurs in visual brain areas. Surprisingly …

Two cortical mechanisms support the integration of visual and auditory speech: A hypothesis and preliminary data

K Okada, G Hickok - Neuroscience letters, 2009 - Elsevier
Visual speech (lip-reading) influences the perception of heard speech. The literature
suggests at least two possible mechanisms for this influence: “direct” sensory–sensory …

Differential auditory and visual phase-locking are observed during audio-visual benefit and silent lip-reading for speech perception

M Aller, HS Økland, LJ MacGregor, H Blank… - Journal of …, 2022 - Soc Neuroscience
Speech perception in noisy environments is enhanced by seeing facial movements of
communication partners. However, the neural mechanisms by which audio and visual …

Decoding the cortical dynamics of sound-meaning mapping

E Kocagoncu, A Clarke, BJ Devereux… - Journal of …, 2017 - Soc Neuroscience
Comprehending speech involves the rapid and optimally efficient mapping from sound to
meaning. Influential cognitive models of spoken word recognition propose that the onset of a …

Inside speech: Multisensory and modality-specific processing of tongue and lip speech actions

A Treille, C Vilain, T Hueber, L Lamalle… - Journal of Cognitive …, 2017 - direct.mit.edu
Action recognition has been found to rely not only on sensory brain areas but also partly on
the observer's motor system. However, whether distinct auditory and visual experiences of …