Learning to optimize combinatorial functions

Nir Rosenfeld*, Eric Balkanski, Amir Globerson, Yaron Singer

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Submodular functions have become a ubiquitous tool in machine learning. They are learnable from data, and can be optimized efficiently and with guarantees. Nonetheless, recent negative results show that optimizing learned surrogates of submodular functions can result in arbitrarily bad approximations of the true optimum. Our goal in this paper is to highlight the source of this hardness, and propose an alternative criterion for optimizing general combinatorial functions from sampled data. We prove a tight equivalence showing that a class of functions is optimizable if and only if it can be learned. We provide efficient and scalable optimization algorithms for several function classes of interest, and demonstrate their utility on the task of optimally choosing trending social media items.
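To illustrate the abstract's claim that submodular functions "can be optimized efficiently and with guarantees," below is a minimal sketch of the classical greedy algorithm of Nemhauser, Wolsey, and Fisher (1978), which achieves a (1 - 1/e)-approximation for maximizing a monotone submodular function under a cardinality constraint. This is the standard baseline, not the paper's proposed method; the coverage function and all names here are illustrative assumptions.

```python
# Classical greedy for monotone submodular maximization under a
# cardinality constraint k; guarantees a (1 - 1/e)-approximation.
# All names below (greedy_max, coverage, sets) are illustrative.

def greedy_max(f, ground_set, k):
    """Greedily select up to k elements maximizing monotone submodular f."""
    selected = set()
    for _ in range(k):
        best, best_gain = None, 0.0
        for e in ground_set - selected:
            gain = f(selected | {e}) - f(selected)  # marginal gain of e
            if gain > best_gain:
                best, best_gain = e, gain
        if best is None:  # no element has positive marginal gain
            break
        selected.add(best)
    return selected

# Example: set coverage, a canonical monotone submodular function.
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d", "e"}}

def coverage(S):
    return len(set().union(*[sets[i] for i in S])) if S else 0

print(greedy_max(coverage, set(sets), k=2))  # e.g. {1, 3}, covering all 5 items
```

The paper's negative results concern exactly this setting: when greedy (or any optimizer) is run on a learned surrogate of f rather than the true f, the approximation guarantee can fail arbitrarily badly, which motivates the alternative optimization criterion the authors propose.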

Original language: English
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Andreas Krause, Jennifer Dy
Publisher: International Machine Learning Society (IMLS)
Pages: 6955-6964
Number of pages: 10
ISBN (Electronic): 9781510867963
State: Published - 2018
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018

Publication series

Name: 35th International Conference on Machine Learning, ICML 2018
Volume: 10

Conference

Conference: 35th International Conference on Machine Learning, ICML 2018
Country/Territory: Sweden
City: Stockholm
Period: 10/07/18 - 15/07/18

Funding

Brain Science Foundation: 2014389
National Science Foundation: CAREER CCF-1452961
