
Forward-backward retraining of recurrent neural networks

Part of Advances in Neural Information Processing Systems 8 (NIPS 1995)

Authors

Andrew W. Senior, Anthony Robinson

Abstract

This paper describes the training of a recurrent neural network as the letter posterior probability estimator for an off-line, hidden Markov model based handwriting recognition system. The network estimates posterior distributions for each of a series of frames representing sections of a handwritten word. The supervised training algorithm, backpropagation through time, requires target outputs to be provided for each frame. Three methods for deriving these targets are presented. A novel method based upon the forward-backward algorithm is found to result in the recognizer with the lowest error rate.
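
The abstract does not give the target-derivation procedure itself, but the standard forward-backward algorithm it refers to computes, for each frame, the posterior probability of each HMM state given the whole observation sequence; these per-frame posteriors can serve as soft training targets. The following is a minimal sketch of that computation, not the authors' code: all names (`forward_backward_targets`, `emission_probs`, `trans`, `init`) are illustrative assumptions, and the scaled alpha/beta recursions are the textbook formulation.

```python
# Sketch: deriving per-frame soft targets from the HMM forward-backward
# algorithm. Hypothetical interface; not taken from the paper.
import numpy as np

def forward_backward_targets(emission_probs, trans, init):
    """Compute per-frame state occupancy probabilities (gamma).

    emission_probs : (T, S) array, P(observation_t | state s)
    trans          : (S, S) array, P(state j at t+1 | state i at t)
    init           : (S,) array, initial state distribution
    Returns a (T, S) array whose rows sum to 1; row t is the posterior
    over states at frame t, usable as a soft target vector.
    """
    T, S = emission_probs.shape
    alpha = np.zeros((T, S))   # scaled forward probabilities
    beta = np.zeros((T, S))    # scaled backward probabilities
    scale = np.zeros(T)        # per-frame scaling to avoid underflow

    # Forward pass
    alpha[0] = init * emission_probs[0]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * emission_probs[t]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass, reusing the forward scaling factors
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = trans @ (emission_probs[t + 1] * beta[t + 1])
        beta[t] /= scale[t + 1]

    # gamma_t(s) = P(state_t = s | all observations)
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, S = 10, 4                          # 10 frames, 4 letter states
    emissions = rng.random((T, S))        # stand-in for network outputs
    trans = rng.random((S, S))
    trans /= trans.sum(axis=1, keepdims=True)
    init = np.full(S, 1.0 / S)
    targets = forward_backward_targets(emissions, trans, init)
    print(targets.round(3))               # each row sums to 1
```

The appeal of such soft targets over hard per-frame labels is that frames whose alignment is ambiguous contribute a distribution rather than a possibly wrong one-hot label, which is consistent with the paper's finding that the forward-backward derived targets yield the lowest error rate.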