Convex learning with invariances

Choon Hui Teo*, Amir Globerson, Sam Roweis, Alexander J. Smola

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Incorporating invariances into a learning algorithm is a common problem in machine learning. We provide a convex formulation which can deal with arbitrary loss functions and arbitrary invariances. In addition, it is a drop-in replacement for most optimization algorithms for kernels, including solvers of the SVMStruct family. The advantage of our setting is that it relies on column generation instead of modifying the underlying optimization problem directly.
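To make the column-generation idea concrete, here is a minimal sketch, not the authors' code: an outer loop repeatedly asks an oracle for the transformation of each training point that the current model handles worst, adds that variant to the working set, and re-solves a standard hinge-loss problem with an unchanged inner solver (the "drop-in" property the abstract mentions). All names, the transformation set, and the hyperparameters (lam, n_rounds, etc.) are illustrative assumptions; the paper itself operates in the kernelized/structured setting.

```python
import numpy as np

def hinge_solver(X, y, lam=0.1, epochs=200, lr=0.1):
    # Plain subgradient descent on L2-regularized hinge loss; a stand-in
    # for any off-the-shelf solver (e.g. one from the SVMStruct family).
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margins = y * (X @ w)
        grad = lam * w
        mask = margins < 1
        if mask.any():
            grad -= (X[mask] * y[mask][:, None]).mean(axis=0)
        w -= lr * grad
    return w

def worst_case_variant(x, y, w, transforms):
    # Oracle step: among the invariance transformations of x, return the
    # variant the current model w scores worst (smallest margin).
    variants = [t(x) for t in transforms]
    margins = [y * (v @ w) for v in variants]
    return variants[int(np.argmin(margins))]

def train_with_invariances(X, y, transforms, n_rounds=5):
    # Column-generation outer loop: each round augments the working set
    # with the currently most violating variant of every example, then
    # re-solves the unchanged inner problem.
    Xw, yw = X.copy(), y.copy()
    w = hinge_solver(Xw, yw)
    for _ in range(n_rounds):
        rows = [worst_case_variant(x, yi, w, transforms) for x, yi in zip(X, y)]
        Xw = np.vstack([Xw, np.array(rows)])
        yw = np.concatenate([yw, y])
        w = hinge_solver(Xw, yw)
    return w

# Toy usage (hypothetical): labels depend only on the second coordinate,
# and the model should be invariant to small shifts of the first.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = np.sign(X[:, 1])
    shifts = [lambda v, s=s: v + np.array([s, 0.0]) for s in (-0.5, 0.0, 0.5)]
    w = train_with_invariances(X, y, shifts)
    print("learned weights:", w)
```

The point of the sketch is the separation of concerns: the inner solver never changes, and invariances enter only through the oracle that generates new columns, which is why the approach can wrap existing kernel solvers rather than requiring a modified optimization problem.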

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 20 - Proceedings of the 2007 Conference
Publisher: Curran Associates Inc.
ISBN (Print): 160560352X, 9781605603520
State: Published - 2008
Externally published: Yes
Event: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007 - Vancouver, BC, Canada
Duration: 3 Dec 2007 - 6 Dec 2007

Publication series

Name: Advances in Neural Information Processing Systems 20 - Proceedings of the 2007 Conference

Conference

Conference: 21st Annual Conference on Neural Information Processing Systems, NIPS 2007
Country/Territory: Canada
City: Vancouver, BC
Period: 3/12/07 - 6/12/07
