A new arc algorithm for unconstrained optimization

Research output: Contribution to journal › Article › peer-review


The gradient path of a real-valued differentiable function is given by the solution of a system of differential equations. For a quadratic function these equations are linear, resulting in a closed-form solution. A quasi-Newton-type algorithm for minimizing an n-dimensional differentiable function is presented. Each stage of the algorithm consists of a search along an arc corresponding to some local quadratic approximation of the function being minimized. The algorithm uses a matrix approximating the Hessian in order to represent the arc. This matrix is updated at each stage and is stored in its Cholesky product form, which simplifies both the representation of the arc and the updating process. Quadratic termination properties of the algorithm are discussed, as well as its global convergence for a general continuously differentiable function. Numerical experiments indicating the efficiency of the algorithm are presented.
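The closed-form gradient path mentioned in the abstract can be illustrated for the quadratic case. A minimal sketch (not the paper's algorithm): for f(x) = ½xᵀAx + bᵀx with symmetric positive-definite A, the gradient path solves the linear system dx/dt = −(Ax + b), whose solution is x(t) = x* + exp(−At)(x0 − x*), where x* = −A⁻¹b is the minimizer. The function name and the example matrix below are illustrative choices, not from the paper.

```python
import numpy as np

def quadratic_gradient_path(A, b, x0, t):
    """Point at 'time' t on the gradient path dx/dt = -(A x + b).

    For symmetric positive-definite A the path has the closed form
    x(t) = x* + exp(-A t) (x0 - x*), computed here via the
    eigendecomposition A = V diag(lam) V^T.
    """
    x_star = np.linalg.solve(A, -b)               # minimizer of the quadratic
    lam, V = np.linalg.eigh(A)                    # eigenvalues/eigenvectors of A
    decay = V @ np.diag(np.exp(-lam * t)) @ V.T   # matrix exponential exp(-A t)
    return x_star + decay @ (x0 - x_star)

# Illustrative 2-D quadratic (assumed data, not from the paper).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([-1.0, 1.0])
x0 = np.array([2.0, 2.0])

print(quadratic_gradient_path(A, b, x0, 0.0))   # recovers the starting point
print(quadratic_gradient_path(A, b, x0, 50.0))  # approaches the minimizer
```

Searching along this arc, rather than along a straight line, is what distinguishes arc algorithms from conventional line-search quasi-Newton methods.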

Original language: English
Pages (from-to): 36-52
Number of pages: 17
Journal: Mathematical Programming
Issue number: 1
State: Published - Dec 1978


  • Arc Algorithms
  • Gradient-path Algorithms
  • Non-linear Programming
  • Optimization
  • Quasi-Newton Methods
  • Unconstrained Optimization

