Connectionist temporal classification

A. Graves, Supervised Sequence Labelling with Recurrent Neural Networks, Springer, 2012
Abstract
This chapter introduces the connectionist temporal classification (CTC) output layer for recurrent neural networks (Graves et al., 2006). As its name suggests, CTC was specifically designed for temporal classification tasks; that is, for sequence labelling problems where the alignment between the inputs and the target labels is unknown. Unlike the hybrid approach described in the previous chapter, CTC models all aspects of the sequence with a single neural network, and does not require the network to be combined with a hidden Markov model. It also does not require presegmented training data, or external postprocessing to extract the label sequence from the network outputs. Experiments on speech and handwriting recognition show that a BLSTM network with a CTC output layer is an effective sequence labeller, generally outperforming standard HMMs and HMM-neural network hybrids, as well as more recent sequence labelling algorithms such as large margin HMMs (Sha and Saul, 2006) and conditional random fields (Lafferty et al., 2001).
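To make the "unknown alignment" idea concrete: CTC sums the probability of every frame-level path (including blanks and repeated symbols) that collapses to the target label sequence, which the chapter's forward-backward recursion computes efficiently. The sketch below is a minimal, unhedged-claims-free illustration of the forward (alpha) pass in plain NumPy, not the book's code; the function and variable names are our own, and it works in raw probabilities rather than the log-space used in practice for numerical stability.

```python
import numpy as np

def ctc_forward(probs, label, blank=0):
    """P(label | input) under CTC via the forward (alpha) recursion.

    probs: (T, V) array of per-frame softmax outputs; column `blank` is the blank.
    label: sequence of non-blank symbol indices.
    """
    T = probs.shape[0]
    # Extended label: blanks interleaved between (and around) the symbols,
    # so repeats like [1, 1] are only reachable via an intervening blank.
    ext = [blank]
    for c in label:
        ext += [c, blank]
    S = len(ext)

    alpha = np.zeros((T, S))
    alpha[0, 0] = probs[0, ext[0]]          # start with blank ...
    if S > 1:
        alpha[0, 1] = probs[0, ext[1]]      # ... or the first symbol
    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1, s]             # stay on the same extended symbol
            if s > 0:
                a += alpha[t - 1, s - 1]    # advance by one
            # Skip the blank, unless that would merge a repeated symbol.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a += alpha[t - 1, s - 2]
            alpha[t, s] = a * probs[t, ext[s]]
    # Valid paths end on the final symbol or the final blank.
    return alpha[T - 1, S - 1] + (alpha[T - 1, S - 2] if S > 1 else 0.0)
```

A brute-force check (enumerating all V^T paths and keeping those that collapse to the label) agrees with the recursion on small inputs, which is a useful sanity test when reimplementing this.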