Dissociating language and thought in large language models

K Mahowald, AA Ivanova, IA Blank, N Kanwisher… - Trends in Cognitive …, 2024 - cell.com
Large language models (LLMs) have come closest among all models to date to mastering
human language, yet opinions about their linguistic and cognitive capabilities remain split …

The language network as a natural kind within the broader landscape of the human brain

E Fedorenko, AA Ivanova, TI Regev - Nature Reviews Neuroscience, 2024 - nature.com
Language behaviour is complex, but neuroscientific evidence disentangles it into
distinct components supported by dedicated brain areas or networks. In this Review, we …

Driving and suppressing the human language network using large language models

G Tuckute, A Sathe, S Srikant, M Taliaferro… - Nature Human …, 2024 - nature.com
Transformer models such as GPT generate human-like language and are predictive of
human brain responses to language. Here, using functional-MRI-measured brain responses …

Prediction during language comprehension: what is next?

R Ryskin, MS Nieuwland - Trends in Cognitive Sciences, 2023 - cell.com
Prediction is often regarded as an integral aspect of incremental language comprehension,
but little is known about the cognitive architectures and mechanisms that support it. We …

Language in brains, minds, and machines

G Tuckute, N Kanwisher… - Annual Review of …, 2024 - annualreviews.org
It has long been argued that only humans could produce and understand language. But
now, for the first time, artificial language models (LMs) achieve this feat. Here we survey the …

The language network is not engaged in object categorization

Y Benn, AA Ivanova, O Clark, Z Mineroff… - Cerebral …, 2023 - academic.oup.com
The relationship between language and thought is the subject of long-standing debate. One
claim states that language facilitates categorization of objects based on a certain feature (eg …

ROSE: A neurocomputational architecture for syntax

E Murphy - Journal of Neurolinguistics, 2024 - Elsevier
A comprehensive neural model of language must accommodate four components:
representations, operations, structures and encoding. Recent intracranial research has …

Convergent representations of computer programs in human and artificial neural networks

S Srikant, B Lipkin, A Ivanova… - Advances in …, 2022 - proceedings.neurips.cc
What aspects of computer programs are represented by the human brain during
comprehension? We leverage brain recordings derived from functional magnetic resonance …

Diverging neural dynamics for syntactic structure building in naturalistic speaking and listening

L Giglio, M Ostarek, D Sharoh… - Proceedings of the …, 2024 - National Acad Sciences
The neural correlates of sentence production are typically studied using task paradigms that
differ considerably from the experience of speaking outside of an experimental setting. In …

Domain-general and language-specific contributions to speech production in a second language: an fMRI study using functional localizers

A Wolna, J Szewczyk, M Diaz, A Domagalik… - Scientific reports, 2024 - nature.com
For bilinguals, speaking in a second language (L2) compared to the native language (L1) is
usually more difficult. In this study we asked whether the difficulty in L2 production reflects …