TY - JOUR
T1 - SEBOOST - Boosting stochastic learning using subspace optimization techniques
AU - Richardson, Elad
AU - Herskovitz, Rom
AU - Ginsburg, Boris
AU - Zibulevsky, Michael
N1 - Publisher Copyright:
© 2016 NIPS Foundation - All Rights Reserved.
PY - 2016
Y1 - 2016
AB - We present SEBOOST, a technique for boosting the performance of existing stochastic optimization methods. SEBOOST applies a secondary optimization process in the subspace spanned by the last steps and descent directions. The method was inspired by the SESOP optimization method and has been adapted for stochastic learning. It can be applied on top of any existing optimization method with no need to tweak its internal algorithm. We show that the method is able to boost the performance of different algorithms and make them more robust to changes in their hyper-parameters. As the boosting steps of SEBOOST are applied between large sets of descent steps, the additional subspace optimization hardly increases the overall computational burden. We introduce hyper-parameters that control the balance between the baseline method and the secondary optimization process. The method was evaluated on several deep learning tasks, demonstrating significant improvement in performance.
UR - http://www.scopus.com/inward/record.url?scp=85018919977&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85018919977
SN - 1049-5258
SP - 1542
EP - 1550
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 30th Annual Conference on Neural Information Processing Systems, NIPS 2016
Y2 - 5 December 2016 through 10 December 2016
ER -