Finite-state Reber automaton and the recurrent neural networks trained in supervised and unsupervised manner

M Čerňanský, Ľ Beňušková - Lecture Notes in Computer Science, 2001 - elibrary.ru
We investigate the evolution of performance of finite-context predictive models built upon the
recurrent activations of two types of recurrent neural networks (RNNs), which are trained
on strings generated according to the Reber grammar. The first type is a 2nd-order version
of the Elman simple RNN trained to perform next-symbol prediction in a supervised
manner. The second RNN is an interesting unsupervised alternative, e.g., the 2nd-order RNN
trained by the Bienenstock, Cooper and Munro (BCM) rule [3]. The BCM learning rule seems …

Finite-State Reber Automaton and the Recurrent Neural Networks Trained in Supervised and Unsupervised Manner

M Čerňanský, Ľ Beňušková - International Conference on Artificial Neural Networks, 2001 - Springer
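The Reber grammar mentioned in the abstract is a small finite-state automaton commonly used as a benchmark for sequence-prediction models. The following is a minimal sketch of a training-string generator, assuming the conventional seven-state transition table of the standard (non-embedded) Reber grammar; the state numbering and the `generate_reber_string` helper are illustrative choices, not taken from the paper itself.

```python
import random

# Transition table of the standard (non-embedded) Reber grammar automaton.
# Each state maps to its outgoing arcs as (emitted symbol, next state);
# `None` marks the accepting end of a string.
TRANSITIONS = {
    0: [("B", 1)],
    1: [("T", 2), ("P", 3)],
    2: [("S", 2), ("X", 4)],
    3: [("T", 3), ("V", 5)],
    4: [("X", 3), ("S", 6)],
    5: [("P", 4), ("V", 6)],
    6: [("E", None)],
}

def generate_reber_string(rng=random):
    """Walk the automaton from the start state, choosing uniformly among
    outgoing arcs, and return the emitted symbol sequence as a string."""
    state, symbols = 0, []
    while state is not None:
        symbol, state = rng.choice(TRANSITIONS[state])
        symbols.append(symbol)
    return "".join(symbols)

print(generate_reber_string())  # e.g. a string such as "BTSSXXTVVE"
```

Every generated string begins with B and ends with E; at each intermediate state the next symbol is ambiguous, which is what makes next-symbol prediction on these strings a non-trivial task for an RNN.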