SEBOOST - Boosting stochastic learning using subspace optimization techniques

Elad Richardson, Rom Herskovitz, Boris Ginsburg, Michael Zibulevsky

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present SEBOOST, a technique for boosting the performance of existing stochastic optimization methods. SEBOOST applies a secondary optimization process in the subspace spanned by the last steps and descent directions. The method was inspired by the SESOP optimization method and has been adapted for stochastic learning. It can be applied on top of any existing optimization method with no need to tweak the internal algorithm. We show that the method is able to boost the performance of different algorithms and make them more robust to changes in their hyper-parameters. As the boosting steps of SEBOOST are applied between large sets of descent steps, the additional subspace optimization hardly increases the overall computational burden. We introduce hyper-parameters that control the balance between the baseline method and the secondary optimization process. The method was evaluated on several deep learning tasks, demonstrating significant improvement in performance.
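The abstract describes the overall scheme: run the baseline stochastic optimizer for a stretch of iterations, then apply a secondary optimization restricted to the subspace spanned by recent steps and descent directions. The following is a minimal NumPy sketch of that idea, not the authors' implementation; all names (boost_every, subspace_dim, the finite-difference inner solver, etc.) are illustrative assumptions, and the inner subspace search here is plain gradient descent rather than the SESOP-style optimizer used in the paper.

import numpy as np

def numerical_grad(f, a, eps=1e-6):
    # Finite-difference gradient in the low-dimensional subspace (illustrative only).
    g = np.zeros_like(a)
    for i in range(a.size):
        e = np.zeros_like(a)
        e[i] = eps
        g[i] = (f(a + e) - f(a - e)) / (2 * eps)
    return g

def seboost_sketch(x0, stochastic_grad, full_loss, lr=0.01, n_iters=1000,
                   boost_every=100, subspace_dim=10, inner_iters=20, inner_lr=0.1):
    x = x0.copy()
    anchor = x0.copy()          # point where the last boosting step ended
    directions = []             # vectors spanning the secondary subspace

    for t in range(1, n_iters + 1):
        x -= lr * stochastic_grad(x)              # baseline stochastic step (e.g. SGD)

        if t % boost_every == 0:
            # Add the aggregated step since the last boost as a new subspace direction.
            directions.append(x - anchor)
            directions = directions[-subspace_dim:]
            P = np.stack(directions, axis=1)      # d x k basis of the subspace

            # Secondary optimization over coefficients alpha in the subspace:
            # minimize full_loss(anchor + P @ alpha).
            alpha = np.zeros(P.shape[1])
            for _ in range(inner_iters):
                g = numerical_grad(lambda a: full_loss(anchor + P @ a), alpha)
                alpha -= inner_lr * g
            x = anchor + P @ alpha
            anchor = x.copy()
    return x

Because the boosting step is taken only once every boost_every baseline iterations and operates on a k-dimensional coefficient vector rather than the full parameter vector, its cost stays small relative to the stochastic descent steps, which matches the abstract's claim about the overall computational burden.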

Original language: English
Pages (from-to): 1542-1550
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016
