Simplifying mixture models using the unscented transform

Jacob Goldberger*, Hayit K. Greenspan, Jeremie Dreyfuss

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The mixture of Gaussians (MoG) model is a useful tool in statistical learning. In many learning processes based on mixture models, the computational requirements are demanding because of the large number of components involved in the model. We propose a novel algorithm for learning a simplified representation of a Gaussian mixture that is based on the Unscented Transform, which was originally introduced for filtering nonlinear dynamical systems. The superiority of the proposed method is validated on both simulation experiments and the categorization of a real image database. The proposed categorization methodology models each image with a Gaussian mixture model; a category model is then obtained by learning a simplified mixture model from all the images in the category.
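
The paper's exact reduction algorithm is not reproduced here; the following is a minimal sketch of the general idea described in the abstract, under stated assumptions: each component of the large mixture is summarized by its unscented-transform sigma points, and a smaller mixture is fit to the pooled, weighted sigma points with a weighted-likelihood EM (in line with the "weighted likelihood" keyword). The function names, the choice kappa = 1.0, and the EM initialization are illustrative assumptions, not the authors' specification.

```python
# Sketch: reduce a large Gaussian mixture to a smaller one via sigma points.
# Assumptions (not from the paper): kappa = 1.0, random-point initialization,
# a fixed number of weighted-EM iterations.
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Deterministic sigma points and weights for a single Gaussian."""
    d = mean.shape[0]
    L = np.linalg.cholesky((d + kappa) * cov)        # columns span the covariance
    pts = [mean] + [mean + L[:, i] for i in range(d)] \
                 + [mean - L[:, i] for i in range(d)]
    wts = np.full(2 * d + 1, 1.0 / (2 * (d + kappa)))
    wts[0] = kappa / (d + kappa)                     # weights sum to 1
    return np.array(pts), wts

def mvn_logpdf(X, mean, cov):
    """Log-density of rows of X under N(mean, cov)."""
    d = mean.shape[0]
    diff = X - mean
    sol = np.linalg.solve(cov, diff.T).T
    quad = np.einsum('ij,ij->i', diff, sol)
    return -0.5 * (quad + np.linalg.slogdet(cov)[1] + d * np.log(2 * np.pi))

def simplify_mog(weights, means, covs, n_reduced, n_iter=50, seed=0):
    """Fit a reduced MoG to the weighted sigma points of a large MoG."""
    X, w = [], []
    for pi_j, mu_j, S_j in zip(weights, means, covs):
        pts, wts = sigma_points(mu_j, S_j)
        X.append(pts)
        w.append(pi_j * wts)                         # mixture weight times sigma weight
    X, w = np.vstack(X), np.concatenate(w)
    w /= w.sum()

    rng = np.random.default_rng(seed)
    d = X.shape[1]
    mu_r = X[rng.choice(len(X), n_reduced, replace=False)]
    cov_r = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * n_reduced)
    pi_r = np.full(n_reduced, 1.0 / n_reduced)

    for _ in range(n_iter):                          # weighted EM on the sigma points
        # E-step: responsibilities of each reduced component for each point
        log_r = np.stack([np.log(pi_r[k]) + mvn_logpdf(X, mu_r[k], cov_r[k])
                          for k in range(n_reduced)], axis=1)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        rw = r * w[:, None]                          # scale by the point weights
        # M-step: weighted parameter updates for the reduced mixture
        nk = rw.sum(axis=0)
        pi_r = nk / nk.sum()
        mu_r = (rw.T @ X) / nk[:, None]
        for k in range(n_reduced):
            diff = X - mu_r[k]
            cov_r[k] = (rw[:, k, None] * diff).T @ diff / nk[k] + 1e-6 * np.eye(d)
    return pi_r, mu_r, cov_r
```

Replacing each component with its sigma points keeps the cost independent of how many samples generated the original mixture: a d-dimensional component contributes only 2d + 1 weighted points, so the reduction scales with the number of components rather than the data size.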

Original language: English
Pages (from-to): 1496-1502
Number of pages: 7
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 30
Issue number: 8
State: Published - Aug 2008

Keywords

  • Clustering
  • Mixture of Gaussians
  • Reduced model
  • Unscented Transform
  • Weighted likelihood
