Variance regularization of RNNLM for speech recognition
2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014 • ieeexplore.ieee.org
Recurrent neural network language models (RNNLMs) have proved superior to many other competitive language modeling techniques in terms of both perplexity and word error rate. The remaining problem is the great computational complexity of the RNNLM output layer, which results in long evaluation times. Class-based RNNLMs, which factorize the output layer, were proposed for speedup, but they are still not fast enough for real-time systems. In this paper, a novel variance regularization algorithm is proposed for RNNLMs to address this problem. All the softmax normalization factors in the output layer are penalized during training so that they converge to one, allowing the output probability to be estimated efficiently with a single vector dot product in the output layer. The computational complexity of the output layer is thus reduced from O(|V|H) to O(H). We further use this model for rescoring in an advanced CD-HMM-DNN system. Experimental results show that the proposed variance regularization algorithm works well: word prediction with the regularized model is about 300 times faster than with the standard RNNLM, with no obvious deterioration in word error rate.
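To make the idea concrete, the following is a minimal NumPy sketch of one plausible realization: during training, the log of the softmax normalizer Z(h) is penalized toward zero (i.e. Z toward one), so that at test time the log-probability of a word can be approximated by a single O(H) dot product instead of the O(|V|H) normalization. The squared log-partition penalty, the hyperparameter gamma, and the helper names (train_loss_and_grad, fast_log_prob) are assumptions for illustration; the paper's exact regularizer may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
H, V = 128, 10000                   # hidden size, vocabulary size (assumed)
W = rng.normal(0, 0.01, (V, H))     # output-layer weights
gamma = 0.1                         # regularization strength (assumed hyperparameter)

def train_loss_and_grad(h, target):
    """Cross-entropy plus a penalty pushing log Z(h) toward 0, i.e. Z(h) toward 1."""
    logits = W @ h                              # O(|V|H), paid only at training time
    m = logits.max()
    log_z = np.log(np.sum(np.exp(logits - m))) + m   # stable log-partition
    ce = log_z - logits[target]                 # -log P(target | h)
    penalty = gamma * log_z ** 2                # variance-regularization term (assumed form)
    p = np.exp(logits - log_z)                  # softmax probabilities
    g = p.copy()
    g[target] -= 1.0                            # d(ce)/d(logits) = softmax - onehot
    g += 2.0 * gamma * log_z * p                # d(penalty)/d(logits)
    return ce + penalty, np.outer(g, h)         # loss, gradient w.r.t. W

def fast_log_prob(h, word):
    """At test time, assume Z(h) ~= 1, so log P(word | h) ~= W[word] . h: O(H)."""
    return W[word] @ h

# Usage with a stand-in hidden state from the RNN:
h = rng.normal(size=H)
loss, grad = train_loss_and_grad(h, target=42)
logp = fast_log_prob(h, word=42)
```

The key design point is that the full softmax over the vocabulary is only ever computed while training; once the regularizer has driven Z(h) close to one, rescoring a hypothesis needs only the dot products for the words actually in it.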