Gaussian mixture models reduction by variational maximum mutual information

Yossi Bar-Yosef, Yuval Bistritz

Research output: Contribution to journal › Article › peer-review


Abstract

Gaussian mixture models (GMMs) are widely used in a variety of classification tasks where it is often important to approximate high-order models by models with fewer components. This paper proposes a novel approach to this problem based on a parametric realization of the maximum mutual information (MMI) criterion and its approximation by a closed-form expression named variational MMI (VMMI). Maximizing the VMMI is analytically tractable and aims to improve the discrimination ability of the reduced set of models, a goal not targeted by previous approaches, which simplify each class-related GMM independently. Two effective algorithms are proposed and studied for optimizing the VMMI criterion: one is a steepest-descent-type algorithm, and the other, called line search A-functions (LSAF), uses concave associated functions. Experiments on two speech-related tasks, phone recognition and language recognition, demonstrate that the VMMI-based parametric model reduction algorithms significantly outperform previous non-discriminative methods. In these experiments, the EM-like LSAF-based algorithm required fewer iterations and converged to a better value of the objective function than the steepest descent algorithm.
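The exact VMMI objective and the LSAF updates are defined in the paper itself and are not reproduced here. To give a flavor of the variational machinery involved, below is a minimal NumPy/SciPy sketch of the well-known closed-form variational approximation to the KL divergence between two GMMs (in the style of Hershey and Olsen, ICASSP 2007), the kind of tractable surrogate on which "variational" mixture criteria are built; the exact KL between mixtures has no closed form. The function names (`gaussian_kl`, `variational_kl_gmm`) and the toy example are illustrative, not from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0) + diff @ cov1_inv @ diff
                  - d + logdet1 - logdet0)

def variational_kl_gmm(w_f, mu_f, cov_f, w_g, mu_g, cov_g):
    """Variational approximation of KL(f || g) between two GMMs.
    Only pairwise Gaussian KLs are needed, so the expression is
    analytically tractable, unlike the exact mixture KL."""
    A, B = len(w_f), len(w_g)
    approx = 0.0
    for a in range(A):
        # log sum_{a'} w_f[a'] * exp(-KL(f_a || f_a'))
        log_num = logsumexp([np.log(w_f[ap])
                             - gaussian_kl(mu_f[a], cov_f[a], mu_f[ap], cov_f[ap])
                             for ap in range(A)])
        # log sum_{b} w_g[b] * exp(-KL(f_a || g_b))
        log_den = logsumexp([np.log(w_g[b])
                             - gaussian_kl(mu_f[a], cov_f[a], mu_g[b], cov_g[b])
                             for b in range(B)])
        approx += w_f[a] * (log_num - log_den)
    return approx

# Toy example: a 4-component GMM in 2-D and a candidate 2-component reduction.
rng = np.random.default_rng(0)
w_f = np.full(4, 0.25)
mu_f = rng.normal(size=(4, 2))
cov_f = np.stack([np.eye(2)] * 4)
w_g = np.array([0.5, 0.5])
mu_g = rng.normal(size=(2, 2))
cov_g = np.stack([np.eye(2)] * 2)
print(variational_kl_gmm(w_f, mu_f, cov_f, w_g, mu_g, cov_g))
```

A per-class divergence of this kind would drive a non-discriminative reduction; the point of the paper's VMMI criterion, by contrast, is to optimize the reduced models of all classes jointly so as to maximize a variational surrogate of the mutual information between class labels and observations.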

Original language: English
Article number: 7027858
Pages (from-to): 1557-1569
Number of pages: 13
Journal: IEEE Transactions on Signal Processing
Volume: 63
Issue number: 6
DOIs
State: Published - 15 Mar 2015

Keywords

  • Continuous-discrete MMI
  • Gaussian mixture models reduction
  • discriminative learning
  • hierarchical clustering
