TY - GEN
T1 - Semi-cyclic stochastic gradient descent
AU - Eichner, Hubert
AU - Koren, Tomer
AU - McMahan, H. Brendan
AU - Srebro, Nathan
AU - Talwar, Kunal
N1 - Publisher Copyright:
Copyright 2019 by the author(s).
PY - 2019
Y1 - 2019
N2 - We consider convex SGD updates with a block-cyclic structure, i.e., where each cycle consists of a small number of blocks, each with many samples from a possibly different, block-specific distribution. This situation arises, e.g., in Federated Learning where the mobile devices available for updates at different times during the day have different characteristics. We show that such block-cyclic structure can significantly deteriorate the performance of SGD, but propose a simple approach that allows prediction with the same guarantees as for i.i.d., non-cyclic sampling.
AB - We consider convex SGD updates with a block-cyclic structure, i.e., where each cycle consists of a small number of blocks, each with many samples from a possibly different, block-specific distribution. This situation arises, e.g., in Federated Learning where the mobile devices available for updates at different times during the day have different characteristics. We show that such block-cyclic structure can significantly deteriorate the performance of SGD, but propose a simple approach that allows prediction with the same guarantees as for i.i.d., non-cyclic sampling.
UR - http://www.scopus.com/inward/record.url?scp=85079446364&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85079446364
T3 - 36th International Conference on Machine Learning, ICML 2019
SP - 3165
EP - 3177
BT - 36th International Conference on Machine Learning, ICML 2019
PB - International Machine Learning Society (IMLS)
T2 - 36th International Conference on Machine Learning, ICML 2019
Y2 - 9 June 2019 through 15 June 2019
ER -