A recurrent neural network without chaos
T Laurent, J von Brecht - arXiv preprint arXiv:1612.06212, 2016 - arxiv.org
We introduce an exceptionally simple gated recurrent neural network (RNN) that achieves
performance comparable to well-known gated architectures, such as LSTMs and GRUs, on
the word-level language modeling task. We prove that our model has simple, predictable and
non-chaotic dynamics. This stands in stark contrast to more standard gated architectures,
whose underlying dynamical systems exhibit chaotic behavior.
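The abstract does not state the model's update rule, so the following is only a hedged sketch of the kind of simple gated recurrence the paper is known for (the chaos-free network, or CFN). The gate names, parameter shapes, and the exact update below are assumptions; the key structural point is that the hidden state enters the recurrence only through an elementwise `tanh`, never through a matrix multiply inside a nonlinearity, which is what keeps the dynamics contractive rather than chaotic.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfn_step(h_prev, x, params):
    """One step of a simple gated recurrence (assumed CFN-style form):

        theta = sigmoid(U_t @ h_prev + V_t @ x + b_t)   # forget gate
        eta   = sigmoid(U_e @ h_prev + V_e @ x + b_e)   # input gate
        h     = theta * tanh(h_prev) + eta * tanh(W @ x)

    The hidden state is only transformed elementwise (tanh), so with
    zero input the state contracts toward the origin instead of
    wandering chaotically.
    """
    U_t, V_t, b_t, U_e, V_e, b_e, W = params
    theta = sigmoid(U_t @ h_prev + V_t @ x + b_t)
    eta = sigmoid(U_e @ h_prev + V_e @ x + b_e)
    return theta * np.tanh(h_prev) + eta * np.tanh(W @ x)
```

Under this assumed form, feeding zero input makes each step `h <- theta * tanh(h)` with `theta` in (0, 1) and `|tanh(h)| < |h|`, so the state norm decreases monotonically toward zero; that is one concrete sense in which the dynamics are "simple, predictable and non-chaotic," in contrast to LSTM/GRU updates whose zero-input trajectories can exhibit chaos.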