Locally Optimal Descent for Dynamic Stepsize Scheduling

Gilad Yehudai, Alon Cohen, Amit Daniely, Yoel Drori, Tomer Koren, Mariano Schain

Research output: Contribution to journal › Conference article › peer-review

Abstract

We introduce a novel dynamic learning-rate scheduling scheme grounded in theory, with the goal of simplifying the manual and time-consuming tuning of schedules in practice. Our approach is based on estimating the locally-optimal stepsize, which guarantees maximal descent in the direction of the stochastic gradient of the current step. We first establish theoretical convergence bounds for our method in the setting of smooth non-convex stochastic optimization. We then present a practical implementation of our algorithm and conduct systematic experiments across diverse datasets and optimization algorithms, comparing our scheme with existing state-of-the-art learning-rate schedulers. Our findings indicate that our method needs minimal tuning compared to existing approaches, removing the need for auxiliary manual schedules and warmup phases and achieving comparable performance with drastically reduced parameter tuning.
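For context, a minimal sketch of what a locally-optimal stepsize looks like under a standard L-smoothness assumption (an illustration only, not necessarily the paper's own estimator): for a step $x_{t+1} = x_t - \eta\, g_t$ along the stochastic gradient $g_t$, the descent lemma gives

\[
f(x_{t+1}) \;\le\; f(x_t) \;-\; \eta\,\langle \nabla f(x_t), g_t \rangle \;+\; \frac{L\eta^2}{2}\,\|g_t\|^2 ,
\]

and minimizing the right-hand side over $\eta$ yields

\[
\eta_t^\star \;=\; \frac{\langle \nabla f(x_t), g_t \rangle}{L\,\|g_t\|^2} ,
\]

which recovers the classical $1/L$ stepsize in the noiseless case $g_t = \nabla f(x_t)$. The scheduler described in the abstract targets a locally-optimal stepsize of this kind using stochastic information; the exact estimator and its guarantees are given in the paper.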

Original language: English
Pages (from-to): 1099-1107
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 258
State: Published - 2025
Event: 28th International Conference on Artificial Intelligence and Statistics, AISTATS 2025 - Mai Khao, Thailand
Duration: 3 May 2025 - 5 May 2025
