
Joint optimization on decoding graphs using minimum classification error criterion

Abdelaziz A. Abdelhamid, Ain Shams University, Egypt, abdelaziz.ieee@live.com; Waleed H. Abdulla, Auckland University, New Zealand
 
Suggested Citation
Abdelaziz A. Abdelhamid and Waleed H. Abdulla (2014), "Joint optimization on decoding graphs using minimum classification error criterion", APSIPA Transactions on Signal and Information Processing: Vol. 3: No. 1, e6. http://dx.doi.org/10.1017/ATSIP.2014.5

Publication Date: 29 Apr 2014
© 2014 Abdelaziz A. Abdelhamid and Waleed H. Abdulla
 
Keywords
Speech recognition; Weighted finite-state transducers; Discriminative training; Acoustic models; Language models
 

Open Access

This article is published under the terms of the Creative Commons Attribution licence.


In this article:
I. INTRODUCTION 
II. MCE-BASED DISCRIMINATIVE TRAINING FRAMEWORK 
III. OPTIMIZATION METHOD 
IV. EXPERIMENTS 
V. SUMMARY AND DISCUSSION 
VI. CONCLUSION 

Abstract

Motivated by the inherent correlation between speech features and their corresponding lexical words, we propose in this paper a new framework for jointly learning the parameters of the corresponding acoustic and language models. The proposed framework is based on discriminative training of the models' parameters using the minimum classification error criterion. To verify the effectiveness of the proposed framework, a set of four large decoding graphs is constructed using weighted finite-state transducers as compositions of two sets of context-dependent acoustic models and two sets of n-gram language models. Experiments conducted on this set of decoding graphs validate the effectiveness of the proposed framework when compared with four baseline systems based on maximum likelihood estimation and separate discriminative training of the acoustic and language models, in benchmark testing on two speech corpora, namely TIMIT and RM1.
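As a rough illustration of the minimum classification error (MCE) criterion referred to in the abstract, the following sketch computes the classic smoothed MCE loss for one utterance from a correct-path score and a set of competing-path scores. This is a generic textbook form (misclassification measure plus a sigmoid smoothing), not the paper's exact formulation; the function name and the parameters `eta`, `gamma`, and `theta` are illustrative assumptions.

```python
import math

def mce_loss(correct_score, competitor_scores, eta=1.0, gamma=1.0, theta=0.0):
    """Smoothed MCE loss for a single training utterance (illustrative sketch).

    correct_score:     discriminant score of the correct transcription path
    competitor_scores: scores of competing (incorrect) decoding paths
    eta:               sharpness of the soft-max over competitors
    gamma, theta:      slope and offset of the sigmoid smoothing
    """
    # Misclassification measure: negative correct-path score plus a
    # soft-max (log-mean-exp) over the competing-path scores.
    soft_max = math.log(
        sum(math.exp(eta * g) for g in competitor_scores) / len(competitor_scores)
    ) / eta
    d = -correct_score + soft_max
    # Sigmoid maps the measure into a smooth, differentiable 0-1 loss,
    # which is what gradient-based discriminative training optimizes.
    return 1.0 / (1.0 + math.exp(-gamma * d - theta))
```

When the correct path scores far above all competitors the loss approaches 0; when a competitor dominates it approaches 1, so minimizing the loss directly reduces expected classification error.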

DOI:10.1017/ATSIP.2014.5