Private stochastic convex optimization: Optimal rates in linear time

Vitaly Feldman, Tomer Koren, Kunal Talwar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

113 Scopus citations

Abstract

We study differentially private (DP) algorithms for stochastic convex optimization: the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions. A recent work of Bassily et al. (2019) has established the optimal bound on the excess population loss achievable given n samples. Unfortunately, their algorithm achieving this bound is relatively inefficient: it requires O(min{n^{3/2}, n^{5/2}/d}) gradient computations, where d is the dimension of the optimization problem. We describe two new techniques for deriving DP convex optimization algorithms both achieving the optimal bound on excess loss and using O(min{n, n^2/d}) gradient computations. In particular, the algorithms match the running time of the optimal non-private algorithms. The first approach relies on the use of variable batch sizes and is analyzed using the privacy amplification by iteration technique of Feldman et al. (2018). The second approach is based on a general reduction to the problem of localizing an approximately optimal solution with differential privacy. Such localization, in turn, can be achieved using existing (non-private) uniformly stable optimization algorithms. As in the earlier work, our algorithms require a mild smoothness assumption. We also give a linear-time optimal algorithm for the strongly convex case, as well as a faster algorithm for the non-smooth case.
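For orientation, the algorithms discussed above belong to the family of noisy gradient methods, where privacy comes from adding calibrated Gaussian noise to each gradient step. The following is only a minimal, hedged sketch of that general flavor in Python, not the paper's variable-batch or localization-based algorithms; the function name, parameter values, and noise level sigma are hypothetical and would in practice be calibrated to a target (eps, delta) via a privacy analysis.

    import numpy as np

    def noisy_projected_sgd(grad, x0, data, lr, sigma, clip=1.0, radius=1.0):
        """One-pass projected SGD with per-step Gaussian noise (illustrative only).

        grad(x, z) returns the gradient of the loss on sample z at point x.
        One gradient computation per sample gives O(n) gradient evaluations.
        """
        x = np.array(x0, dtype=float)
        for z in data:
            g = grad(x, z)
            g = g / max(1.0, np.linalg.norm(g) / clip)          # enforce a Lipschitz/clipping bound
            x = x - lr * (g + sigma * np.random.randn(*x.shape))  # noisy gradient step
            norm = np.linalg.norm(x)
            if norm > radius:                                     # project back onto the feasible ball
                x = x * (radius / norm)
        return x

    # Toy usage on squared losses l(x; (a, b)) = 0.5 * (a @ x - b)^2 (hypothetical setup).
    rng = np.random.default_rng(0)
    A = rng.normal(size=(1000, 5))
    b = A @ np.ones(5) + 0.1 * rng.normal(size=1000)
    grad = lambda x, z: (z[0] @ x - z[1]) * z[0]
    x_hat = noisy_projected_sgd(grad, np.zeros(5), list(zip(A, b)), lr=0.05, sigma=0.1)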

Original language: English
Title of host publication: STOC 2020 - Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing
Editors: Konstantin Makarychev, Yury Makarychev, Madhur Tulsiani, Gautam Kamath, Julia Chuzhoy
Publisher: Association for Computing Machinery
Pages: 439-449
Number of pages: 11
ISBN (Electronic): 9781450369794
DOIs
State: Published - 8 Jun 2020
Event: 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020 - Chicago, United States
Duration: 22 Jun 2020 - 26 Jun 2020

Publication series

Name: Proceedings of the Annual ACM Symposium on Theory of Computing
ISSN (Print): 0737-8017

Conference

Conference: 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020
Country/Territory: United States
City: Chicago
Period: 22/06/20 - 26/06/20

Keywords

  • Differential Privacy
  • Stochastic Convex Optimization
  • Stochastic Gradient Descent
