Emergent multi-agent communication in the deep learning era

A Lazaridou, M Baroni - arXiv preprint arXiv:2006.02419, 2020 - arxiv.org
The ability to cooperate through language is a defining feature of humans. As the
perceptual, motor and planning capabilities of deep artificial networks increase …

What artificial neural networks can tell us about human language acquisition

A Warstadt, SR Bowman - Algebraic structures in natural …, 2022 - taylorfrancis.com
Rapid progress in machine learning for natural language processing has the potential to
transform debates about how humans learn language. However, the learning environments …

Emergent communication at scale

R Chaabouni, F Strub, F Altché, E Tarassov… - International …, 2022 - openreview.net
Emergent communication aims for a better understanding of human language evolution and
building more efficient representations. We posit that reaching these goals will require …

Multi-agent communication meets natural language: Synergies between functional and structural language learning

A Lazaridou, A Potapenko, O Tieleman - arXiv preprint arXiv:2005.07064, 2020 - arxiv.org
We present a method for combining multi-agent communication and traditional data-driven
approaches to natural language learning, with an end goal of teaching agents to …

Anti-efficient encoding in emergent communication

R Chaabouni, E Kharitonov… - Advances in Neural …, 2019 - proceedings.neurips.cc
Despite renewed interest in emergent language simulations with neural networks, little is
known about the basic properties of the induced code, and how they compare to human …

Toward More Human-Like AI Communication: A Review of Emergent Communication Research

N Brandizzi - IEEE Access, 2023 - ieeexplore.ieee.org
In the recent shift towards human-centric AI, the need for machines to accurately use natural
language has become increasingly important. While a common approach to achieve this is …

Analyzing redundancy in pretrained transformer models

F Dalvi, H Sajjad, N Durrani, Y Belinkov - arXiv preprint arXiv:2004.04010, 2020 - arxiv.org
Transformer-based deep NLP models are trained using hundreds of millions of parameters,
limiting their applicability in computationally constrained environments. In this paper, we …

Similarity analysis of contextual word representation models

JM Wu, Y Belinkov, H Sajjad, N Durrani, F Dalvi… - arXiv preprint arXiv …, 2020 - arxiv.org
This paper investigates contextual word representation models from the lens of similarity
analysis. Given a collection of trained models, we measure the similarity of their internal …

Few-shot language coordination by modeling theory of mind

H Zhu, G Neubig, Y Bisk - International Conference on …, 2021 - proceedings.mlr.press
No man is an island. Humans develop the ability to communicate with a large community by
coordinating with different interlocutors within short conversations. This ability is largely …

EGG: a toolkit for research on Emergence of lanGuage in Games

E Kharitonov, R Chaabouni, D Bouchacourt… - arXiv preprint arXiv …, 2019 - arxiv.org
There is renewed interest in simulating language emergence among deep neural agents
that communicate to jointly solve a task, spurred by the practical aim to develop language …