Authors
Leila Arras, Grégoire Montavon, Klaus-Robert Müller, Wojciech Samek
Publication date
2017/9
Workshop paper
EMNLP'17 Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis (WASSA)
Pages
159-168
Description
Recently, a technique called Layer-wise Relevance Propagation (LRP) was shown to deliver insightful explanations in the form of input space relevances for understanding feed-forward neural network classification decisions. In the present work, we extend the usage of LRP to recurrent neural networks. We propose a specific propagation rule applicable to multiplicative connections as they arise in recurrent network architectures such as LSTMs and GRUs. We apply our technique to a word-based bi-directional LSTM model on a five-class sentiment prediction task, and evaluate the resulting LRP relevances both qualitatively and quantitatively, obtaining better results than a gradient-based related method which was used in previous work.
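The two propagation rules the abstract mentions can be sketched in a few lines: an epsilon-stabilized LRP rule for linear (weighted-sum) connections, and a rule for multiplicative gate interactions that assigns all relevance to the signal ("source") factor and none to the gate. The function names and shapes below are illustrative, not taken from the paper's released code; this is a minimal NumPy sketch, assuming a single dense layer and an element-wise gate * source product as they occur inside an LSTM cell.

```python
import numpy as np

def lrp_linear(h_in, w, b, h_out, r_out, eps=1e-3):
    """Epsilon-LRP through a linear layer h_out = h_in @ w + b.

    h_in: (D,) inputs, w: (D, M) weights, b: (M,) bias,
    h_out: (M,) pre-computed outputs, r_out: (M,) relevance to redistribute.
    Returns relevance of shape (D,) on the inputs.
    """
    sign = np.where(h_out >= 0, 1.0, -1.0)
    denom = h_out + eps * sign                    # stabilized denominator
    msgs = (h_in[:, None] * w) / denom[None, :]   # each input's share of each output
    return msgs @ r_out                           # redistribute relevance to inputs

def lrp_multiplicative(r_product):
    """Rule for a gated product z = gate * source (e.g. LSTM input/forget gates):
    the gate receives zero relevance, the source receives all of it."""
    r_gate = np.zeros_like(r_product)
    r_source = r_product
    return r_gate, r_source
```

With a small `eps` and zero bias, `lrp_linear` approximately conserves relevance (the sum over inputs matches the sum over outputs up to the stabilizer), which is the property that makes layer-by-layer backward redistribution through the unrolled LSTM meaningful.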
Total citations
Cited by (per year): 2017: 7, 2018: 37, 2019: 60, 2020: 70, 2021: 80, 2022: 74, 2023: 78, 2024: 33
Scholar articles
L Arras, G Montavon, KR Müller, W Samek - arXiv preprint arXiv:1706.07206, 2017