Syntactic structure from deep learning
T Linzen, M Baroni - Annual Review of Linguistics, 2021 - annualreviews.org
Modern deep neural networks achieve impressive performance in engineering applications
that require extensive linguistic skills, such as machine translation. This success has …
Semantic structure in deep learning
E Pavlick - Annual Review of Linguistics, 2022 - annualreviews.org
Deep learning has recently come to dominate computational linguistics, leading to claims of
human-level performance in a range of language processing tasks. Like much previous …
Modern language models refute Chomsky's approach to language
ST Piantadosi - From fieldwork to linguistic theory: A tribute to …, 2023 - books.google.com
Modern machine learning has subverted and bypassed the theoretical framework of
Chomsky's generative approach to linguistics, including its core claims to particular insights …
On the opportunities and risks of foundation models
AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are
trained on broad data at scale and are adaptable to a wide range of downstream tasks. We …
What BERT is not: Lessons from a new suite of psycholinguistic diagnostics for language models
A Ettinger - Transactions of the Association for Computational …, 2020 - direct.mit.edu
Pre-training by language modeling has become a popular and successful approach to NLP
tasks, but we have yet to understand exactly what linguistic capacities these pre-training …
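
Diagnostics in this line of work amount to cloze tests scored by the model's own predictions. A minimal sketch, assuming the Hugging Face transformers library and the bert-base-uncased checkpoint; the negation pair below is illustrative, not one of Ettinger's actual stimuli:

    # Cloze-style probe of a masked language model: compare completions for an
    # affirmative context and its negated counterpart. A model driven purely by
    # lexical association may answer both the same way.
    from transformers import pipeline

    fill = pipeline("fill-mask", model="bert-base-uncased")
    for context in ["A robin is a [MASK].", "A robin is not a [MASK]."]:
        top = fill(context, top_k=3)
        print(context, [(p["token_str"], round(p["score"], 3)) for p in top])
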
Large language models demonstrate the potential of statistical learning in language
P Contreras Kallens… - Cognitive …, 2023 - Wiley Online Library
To what degree can language be acquired from linguistic input alone? This question has
vexed scholars for millennia and is still a major focus of debate in the cognitive science of …
Using computational models to test syntactic learnability
We studied the learnability of English filler-gap dependencies and the “island” constraints on
them by assessing the generalizations made by autoregressive (incremental) language …
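
The underlying method compares a model's surprisal on minimal pairs: if the model has induced the island constraint, extraction out of the island should be more surprising. A minimal sketch, assuming the Hugging Face transformers library and GPT-2; the sentence pair is illustrative, not drawn from the paper's stimuli:

    # Sentence-level surprisal (summed negative log probability, in nats)
    # under an autoregressive language model.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tok = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    def surprisal(text):
        ids = tok(text, return_tensors="pt").input_ids
        with torch.no_grad():
            logprobs = torch.log_softmax(model(ids).logits[0, :-1], dim=-1)
        return -logprobs[torch.arange(ids.size(1) - 1), ids[0, 1:]].sum().item()

    # Licensed filler-gap dependency vs. a complex-NP island violation:
    # a model that has learned the constraint should find the second harder.
    print(surprisal("What did the author claim that the editor liked?"))
    print(surprisal("What did the author make the claim that the editor liked?"))
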
A fine-grained comparison of pragmatic language understanding in humans and language models
Pragmatics and non-literal language understanding are essential to human communication,
and present a long-standing challenge for artificial language models. We perform a fine …
Transformer grammars: Augmenting transformer language models with syntactic inductive biases at scale
We introduce Transformer Grammars (TGs), a novel class of Transformer language
models that combine (i) the expressive power, scalability, and strong performance of …
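
The core move is to train an ordinary next-token predictor over a linearization of the parse tree rather than over the bare word string. A minimal sketch of that encoding only; the actual TG action inventory and its constrained attention masking are more involved and are not reproduced here:

    # Linearize a phrase-structure tree into a flat sequence of opening
    # actions, words, and closing actions, suitable for a next-token
    # objective over joint (string, tree) sequences.
    def linearize(tree):
        if isinstance(tree, str):        # terminal: emit the word
            return [tree]
        label, *children = tree
        seq = [f"({label}"]              # open the constituent
        for child in children:
            seq.extend(linearize(child))
        seq.append(f"{label})")          # close the constituent
        return seq

    print(" ".join(linearize(("S", ("NP", "the", "cat"), ("VP", "sleeps")))))
    # -> (S (NP the cat NP) (VP sleeps VP) S)
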
Causal analysis of syntactic agreement mechanisms in neural language models
Targeted syntactic evaluations have demonstrated the ability of language models to perform
subject-verb agreement given difficult contexts. To elucidate the mechanisms by which the …
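
A targeted agreement evaluation of this kind reduces to comparing the probabilities the model assigns to the two verb forms after an agreement attractor; the causal analysis then intervenes on model components to see which ones carry the number information. A minimal sketch of the evaluation step, again assuming the Hugging Face transformers library and GPT-2, with an illustrative stimulus:

    # Compare next-token probabilities of singular vs. plural verb forms
    # after a prefix whose attractor noun ("cabinets") mismatches the head
    # noun ("key") in number.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tok = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

    ids = tok("The key to the cabinets", return_tensors="pt").input_ids
    with torch.no_grad():
        probs = torch.softmax(model(ids).logits[0, -1], dim=-1)

    for verb in [" is", " are"]:         # leading space matters for GPT-2 BPE
        print(verb.strip(), float(probs[tok(verb).input_ids[0]]))
    # A model tracking the head noun should assign "is" the higher probability.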