Analyzing and interpreting neural networks for NLP: A report on the first BlackboxNLP workshop

A Alishahi, G Chrupała, T Linzen - Natural Language Engineering, 2019 - cambridge.org
The Empirical Methods in Natural Language Processing (EMNLP) 2018 workshop
BlackboxNLP was dedicated to resources and techniques specifically developed for …

MMT: cross domain few-shot learning via meta-memory transfer

W Wang, L Duan, Y Wang, J Fan… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Few-shot learning aims to recognize novel categories solely relying on a few labeled
samples, with existing few-shot methods primarily focusing on the categories sampled from …

State gradients for analyzing memory in LSTM language models

L Verwimp, P Wambacq - Computer Speech & Language, 2020 - Elsevier
Gradients can be used to train neural networks, but they can also be used to interpret them.
We investigate how well the inputs of RNNs are remembered by their state by calculating …
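The snippet above describes measuring how well an RNN's state remembers earlier inputs via gradients of the state with respect to those inputs. As a rough, hedged illustration of that idea (not the authors' actual method or code), the sketch below hand-rolls a tiny Elman RNN in NumPy and computes the Frobenius norm of the Jacobian of the final hidden state with respect to each input step; all weights, sizes, and the decay behavior shown are illustrative assumptions.

```python
import numpy as np

# Illustrative toy setup (not from the paper): a small Elman RNN
# h_t = tanh(Wx @ x_t + Wh @ h_{t-1}) with random weights.
rng = np.random.default_rng(0)
n_in, n_hid, T = 4, 8, 10
Wx = rng.normal(0.0, 0.3, (n_hid, n_in))
Wh = rng.normal(0.0, 0.2, (n_hid, n_hid))  # small scale -> contractive dynamics
xs = rng.normal(0.0, 1.0, (T, n_in))

# Forward pass, keeping every hidden state for the backward Jacobians.
hs = []
h = np.zeros(n_hid)
for t in range(T):
    h = np.tanh(Wx @ xs[t] + Wh @ h)
    hs.append(h)

def state_gradient_norms(hs, Wx, Wh):
    """Frobenius norm of d h_{T-1} / d x_t for each time step t.

    Uses the chain rule: d h_t / d h_{t-1} = diag(1 - h_t^2) @ Wh
    and d h_t / d x_t = diag(1 - h_t^2) @ Wx for tanh units.
    """
    T = len(hs)
    norms = [0.0] * T
    J = np.eye(len(hs[0]))  # accumulated d h_{T-1} / d h_t, starts as identity
    for t in range(T - 1, -1, -1):
        D = np.diag(1.0 - hs[t] ** 2)          # derivative of tanh at step t
        norms[t] = np.linalg.norm(J @ D @ Wx)  # sensitivity of final state to x_t
        J = J @ D @ Wh                         # push the Jacobian one step back
    return norms

norms = state_gradient_norms(hs, Wx, Wh)
```

With contractive recurrent weights, `norms` shrinks as inputs lie further in the past, i.e. the classic vanishing-gradient picture of limited RNN memory; larger or gated recurrences would change this profile.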

Text generation and enhanced evaluation of metric for machine translation

SS Amin, L Ragha - … Intelligence and Cognitive Informatics: Proceedings of …, 2021 - Springer
Here the power of a recurrent neural network (RNN) is exhibited for generating
grammatically correct new text from a given input text and translating the new text into the …