First-order methods in optimization

Research output: Book/Report › Book › peer-review

Abstract

The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large- or even huge-scale optimization problems, there has been a revived interest in simple methods with low iteration cost and low memory requirements. The author has gathered, reorganized, and synthesized, in a unified manner, many results that are currently scattered throughout the literature, many of which cannot typically be found in optimization books. First-Order Methods in Optimization offers a comprehensive study of first-order methods together with their theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable and functional decomposition methods.
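As a concrete illustration of the "low iteration cost" point above, the sketch below runs projected gradient descent, one of the canonical first-order methods the book covers, on a nonnegative least-squares problem. Each step uses only the gradient (never the Hessian) plus a cheap projection. The function name, test problem, and step-size choice here are assumptions made for this example, not material from the book:

```python
import numpy as np

def projected_gradient(A, b, step, iters=2000):
    """Projected gradient descent for min ||Ax - b||^2 subject to x >= 0.

    Uses only first-order information: the gradient 2 A^T (A x - b),
    never the Hessian 2 A^T A, so each iteration costs two
    matrix-vector products plus an O(n) projection.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = 2.0 * A.T @ (A @ x - b)        # gradient of the smooth objective
        x = np.maximum(x - step * grad, 0.0)  # Euclidean projection onto x >= 0
    return x

# Small synthetic instance with a known nonnegative solution (an assumption
# for this illustration, not an example from the book).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.abs(rng.standard_normal(5))
b = A @ x_true

# A standard safe step size is 1/L, where L = 2 * lambda_max(A^T A) is the
# Lipschitz constant of the gradient.
L = 2.0 * np.linalg.eigvalsh(A.T @ A).max()
x_hat = projected_gradient(A, b, step=1.0 / L)
```

The 1/L step size is the classical choice for which the method's sublinear (and, under strong convexity, linear) convergence rates are established; complexity analyses of exactly this kind of iteration are a central theme of the book.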
Original language: English
Place of publication: Philadelphia
Publisher: Society for Industrial and Applied Mathematics (SIAM)
ISBN (print): 9781611974980, 9781611974997, 1611974992
State: Published - 2017

Publication series

Name: MOS-SIAM Series on Optimization
Volume: 25

Keywords

  • Scientific computing
  • Nonlinear optimization
  • Decomposition methods
  • First-order methods
  • Mathematical optimization
  • Convex analysis
  • Convergence

