Recurrent neural network language model training with noise contrastive estimation for speech recognition
2015 IEEE International Conference on Acoustics, Speech and Signal …, 2015•ieeexplore.ieee.org
In recent years recurrent neural network language models (RNNLMs) have been successfully applied to a range of tasks including speech recognition. However, an important issue that limits the quantity of data used, and their possible application areas, is the computational cost in training. A significant part of this cost is associated with the softmax function at the output layer, as it requires a normalization term to be explicitly calculated. This impacts both training and testing speed, especially when a large output vocabulary is used. To address this problem, noise contrastive estimation (NCE) is explored in RNNLM training. NCE does not require this normalization in either training or testing, and it is insensitive to the output-layer size. On a large vocabulary conversational telephone speech recognition task, a doubling in training speed on a GPU and a 56 times speed-up in test-time evaluation on a CPU were obtained.
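To make the cost argument concrete, the sketch below contrasts the full softmax (which must touch all V output rows to compute the normalizer) with an NCE-style per-word objective that scores only the target and k noise samples. This is a minimal NumPy illustration, not the paper's implementation: the sizes, the uniform noise distribution, and the assumption that the normalizer is constant (ln Z = 0) are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's configuration).
V = 1000   # output vocabulary size
H = 16     # RNN hidden-state size
k = 10     # noise samples per target word

W = rng.normal(scale=0.1, size=(V, H))  # output-layer weights
h = rng.normal(size=H)                  # hidden state for the current word

def softmax_logprob(target):
    """Full softmax: O(V) per word, because the normalizer sums over
    every vocabulary row."""
    logits = W @ h
    m = logits.max()
    return logits[target] - m - np.log(np.exp(logits - m).sum())

def nce_loss(target, noise_ids, log_pn):
    """NCE objective for one word: score only the target and k noise
    samples, so the cost is O(k), independent of V. The unnormalized
    score W[w] @ h stands in for the model's log-probability, with the
    normalizer assumed constant (ln Z = 0) -- the property that makes
    unnormalized test-time evaluation possible."""
    def log_p_data(w):   # log P(D=1 | w): w was drawn from the model
        s = W[w] @ h
        return s - np.logaddexp(s, np.log(k) + log_pn[w])
    def log_p_noise(w):  # log P(D=0 | w): w was drawn from the noise dist.
        s = W[w] @ h
        return (np.log(k) + log_pn[w]) - np.logaddexp(s, np.log(k) + log_pn[w])
    return -log_p_data(target) - sum(log_p_noise(w) for w in noise_ids)

# Uniform noise distribution (an assumption; a unigram distribution
# over the training data is the usual choice in practice).
log_pn = np.full(V, -np.log(V))
loss = nce_loss(target=5, noise_ids=rng.integers(0, V, size=k), log_pn=log_pn)
```

Note that `nce_loss` never forms the full `W @ h` product over the vocabulary: it indexes only k + 1 rows of `W`, which is what makes the objective insensitive to the output-layer size.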