A neural model for compositional word embeddings and sentence processing

JP Bernardy, S Lappin - 2022 - qmro.qmul.ac.uk
We propose a new neural model for word embeddings, which uses unitary matrices as the
primary device for encoding lexical information. It uses simple matrix multiplication to derive …

[BOOK][B] Evaluating transformer's ability to learn mildly context-sensitive languages

S Wang - 2021 - search.proquest.com
Transformer models perform well on NLP tasks, but recent theoretical studies suggest their
ability to model certain regular and context-free languages is limited. This creates a …

Unitary Recurrent Networks: Algebraic and Linear Structures for Syntax

JP Bernardy, S Lappin - Algebraic Structures in Natural Language, 2022 - taylorfrancis.com
The emergence of powerful deep learning systems has largely displaced classical symbolic
algebraic models of linguistic representation in computational linguistics. While deep neural …

Assessing the unitary RNN as an end-to-end compositional model of syntax

JP Bernardy, S Lappin - arXiv preprint arXiv:2208.05719, 2022 - arxiv.org
We show that both an LSTM and a unitary-evolution recurrent neural network (URN) can
achieve encouraging accuracy on two types of syntactic patterns: context-free long distance …

[BOOK][B] Algebraic Structures in Natural Language

S Lappin, JP Bernardy - 2022 - api.taylorfrancis.com
Algebraic Structures in Natural Language addresses a central problem in cognitive …

On how transformers learn to understand and evaluate nested arithmetic expressions

D Grashoff - 2022 - studenttheses.uu.nl
In this thesis, we studied whether self-attention networks can learn compositional semantics
using an arithmetic language. The goal of the language is to evaluate the meaning of nested …