Chen, X., Liu, X., Ragni, A. et al. (2 more authors) (2018) Future word contexts in neural network language models. In: 2017 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU). IEEE Automatic Speech Recognition and Understanding Workshop (ASRU), 16-20 Dec 2017, Okinawa, Japan. IEEE , pp. 97-103. ISBN 9781509047895
Abstract
Recently, bidirectional recurrent neural network language models (bi-RNNLMs) have been shown to outperform standard, unidirectional, recurrent neural network language models (uni-RNNLMs) on a range of speech recognition tasks. This indicates that future word context information beyond the word history can be useful. However, bi-RNNLMs pose a number of challenges as they make use of the complete previous and future word context information. This impacts both training efficiency and their use within a lattice rescoring framework. In this paper these issues are addressed by proposing a novel neural network structure, succeeding word RNNLMs (suRNNLMs). Instead of using a recurrent unit to capture the complete future word context, a feedforward unit is used to model a finite number of succeeding, future, words. This model can be trained much more efficiently than bi-RNNLMs and can also be used for lattice rescoring. Experimental results on a meeting transcription task (AMI) show that the proposed model consistently outperforms uni-RNNLMs and yields only a slight degradation compared to bi-RNNLMs in N-best rescoring. Additionally, performance improvements can be obtained using lattice rescoring and subsequent confusion network decoding.
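The core idea in the abstract — keep a unidirectional recurrent unit for the word history, but replace the backward recurrence of a bi-RNNLM with a feedforward unit over a fixed window of k succeeding words — can be illustrated with a minimal numpy sketch. All names, dimensions, and the combination scheme below are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

# Hypothetical sketch of a succeeding-word RNNLM (suRNNLM):
# a uni-directional recurrence summarizes the history, while a
# feedforward layer summarizes a fixed window of k succeeding words.
# Dimensions and weight names are illustrative, not from the paper.

rng = np.random.default_rng(0)
V, d, h, k = 50, 16, 32, 2   # vocab size, embedding dim, hidden dim, future window

E = rng.standard_normal((V, d)) * 0.1        # word embeddings
W_h = rng.standard_normal((h, h)) * 0.1      # recurrent (history) weights
W_x = rng.standard_normal((h, d)) * 0.1      # input-to-hidden weights
W_f = rng.standard_normal((h, k * d)) * 0.1  # feedforward (future-window) weights
W_o = rng.standard_normal((V, 2 * h)) * 0.1  # output projection

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def surnnlm_step(h_prev, w_t, future_words):
    """Predict the next-word distribution from the history state
    and a fixed window of k succeeding words."""
    # History: standard uni-RNN recurrence over the current word.
    h_t = np.tanh(W_h @ h_prev + W_x @ E[w_t])
    # Future: concatenate the embeddings of the k succeeding words and
    # pass them through a single feedforward layer (no recurrence),
    # which is what makes training and lattice rescoring tractable.
    f_t = np.tanh(W_f @ np.concatenate([E[w] for w in future_words]))
    # Combine both context vectors for the output distribution.
    p = softmax(W_o @ np.concatenate([h_t, f_t]))
    return h_t, p

h0 = np.zeros(h)
h1, p = surnnlm_step(h0, w_t=3, future_words=[7, 12])
print(p.shape)  # (50,) — a distribution over the vocabulary
```

Because the future context is a fixed-size window rather than a full backward recurrence, each prediction depends only on k succeeding words, which is what allows the model to be applied within lattice rescoring where complete future sentences are unavailable.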
Metadata
Copyright, Publisher and Additional Information: © 2017 IEEE.
Keywords: Bidirectional recurrent neural network; language model; succeeding words; speech recognition
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield)
Depositing User: Symplectic Sheffield
Date Deposited: 25 Nov 2019 11:44
Last Modified: 25 Nov 2019 11:44
Status: Published
Publisher: IEEE
Refereed: Yes
Identification Number: https://doi.org/10.1109/ASRU.2017.8268922