Metric learning by collapsing classes

Amir Globerson*, Sam Roweis

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution (peer-reviewed)

521 Scopus citations

Abstract

We present an algorithm for learning a quadratic Gaussian metric (Mahalanobis distance) for use in classification tasks. Our method relies on the simple geometric intuition that a good metric is one under which points in the same class are simultaneously near each other and far from points in the other classes. We construct a convex optimization problem whose solution generates such a metric by trying to collapse all examples in the same class to a single point and push examples in other classes infinitely far away. We show that when the metric we learn is used in simple classifiers, it yields substantial improvements over standard alternatives on a variety of problems. We also discuss how the learned metric may be used to obtain a compact low-dimensional feature representation of the original input space, allowing more efficient classification with very little reduction in performance.
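The optimization the abstract alludes to can be sketched as follows. This is not the authors' code: it is a minimal projected-gradient sketch, assuming the convex objective is the KL divergence between an ideal "collapsed" neighbor distribution (uniform over same-class points, zero elsewhere) and a softmax distribution induced by the learned Mahalanobis distance, with the metric matrix kept positive semidefinite by eigenvalue clipping. The function name mcml_fit and the hyperparameters n_iters and lr are illustrative, not from the paper.

```python
import numpy as np

def mcml_fit(X, y, n_iters=100, lr=1e-3):
    """Learn a PSD matrix A parametrizing the Mahalanobis distance
    d_A(x, x') = (x - x')^T A (x - x') by projected gradient descent.
    Assumes every class in y has at least two examples."""
    n, d = X.shape
    A = np.eye(d)  # start from the plain Euclidean metric
    # "Ideal" collapsed-class target: each point spreads its mass
    # uniformly over the other points of its own class.
    same = (y[:, None] == y[None, :]) & ~np.eye(n, dtype=bool)
    p0 = same / same.sum(axis=1, keepdims=True)
    diffs = X[:, None, :] - X[None, :, :]  # (n, n, d) pairwise differences
    for _ in range(n_iters):
        # Squared Mahalanobis distances under the current metric.
        d2 = np.einsum('ijd,de,ije->ij', diffs, A, diffs)
        np.fill_diagonal(d2, np.inf)  # exclude self-pairs from the softmax
        logits = -d2 - (-d2).max(axis=1, keepdims=True)  # stabilized
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        # Gradient of sum_i KL(p0(.|i) || p_A(.|i)) w.r.t. A reduces to a
        # weighted sum of outer products of the pairwise differences.
        grad = np.einsum('ij,ijd,ije->de', p0 - p, diffs, diffs)
        A -= lr * grad
        # Project back onto the PSD cone by clipping negative eigenvalues.
        vals, vecs = np.linalg.eigh(A)
        A = (vecs * np.clip(vals, 0, None)) @ vecs.T
    return A
```

With features X of shape (n, d) and integer labels y, A = mcml_fit(X, y) yields a metric that can be plugged into a nearest-neighbor classifier. The compact low-dimensional representation mentioned at the end of the abstract can then be read off the learned matrix: keep the top eigenvectors of A, scaled by the square roots of their eigenvalues, as a linear projection of the inputs.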

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 18 - Proceedings of the 2005 Conference
Pages: 451-458
Number of pages: 8
State: Published - 2005
Externally published: Yes
Event: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005 - Vancouver, BC, Canada
Duration: 5 Dec 2005 - 8 Dec 2005

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005
Country/Territory: Canada
City: Vancouver, BC
Period: 5/12/05 - 8/12/05
