Guaranteed Convergence of the Hough Transform

Menashe Soffer, Nahum Kiryati

Research output: Contribution to journal › Article › peer-review


The straight-line Hough Transform using normal parameterization with a continuous voting kernel is considered. It transforms the collinearity detection problem into the problem of finding the global maximum of a two-dimensional function over a domain in the parameter space. The principle is similar to robust regression using fixed-scale M-estimation. Unlike standard M-estimation procedures, the Hough Transform does not rely on a good initial estimate of the line parameters: the global optimization problem is approached by exhaustive search on a grid that is usually as fine as computationally feasible. The global maximum of a general function over a bounded domain cannot be found by a finite number of function evaluations. Only if sufficient a priori knowledge about the smoothness of the objective function is available can convergence to the global maximum be guaranteed. The extraction of a priori information and its efficient use are the main challenges in real global optimization problems. Convergence in the Hough Transform is the ability to ensure that the global maximum is in the immediate neighborhood of the maximal grid point. More than 30 years after Hough patented the basic algorithm, it is still not clear how fine the parameter space quantization should be in order not to miss the true maximum. In this paper, conditions for the convergence of the Hough Transform to the global maximum are derived. The necessary constraints on the variability of the objective (Hough) function are obtained by using the saturated parabolic voting kernel and by defining an image model with several application-dependent parameters. Random errors in the location of edge points and background noise are allowed in the model and lead to statistical convergence guarantees. Significant intermediate results are obtained on the structure of the peak region and on the spatial statistics of noise voting in the continuous-kernel Hough Transform. Convergence strategies are studied and the necessary parameter space quantization intervals are derived. Guaranteed focusing policies for multiresolution Hough algorithms are developed. Application of the theoretical results to images that deviate from the image model is considered and exemplified.
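The setting described in the abstract can be illustrated with a minimal sketch: each edge point votes in the normal-parameterized (θ, ρ) space, ρ = x·cos θ + y·sin θ, using a continuous parabolic kernel with compact support, and the line is recovered as the global maximum of the accumulator over the grid. The kernel half-width `w`, the grid resolutions, and the point set below are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def hough_transform(points, n_theta=180, n_rho=200, rho_max=2.0, w=0.05):
    """Continuous-kernel straight-line Hough Transform (illustrative sketch).

    Each point (x, y) traces the sinusoid rho = x*cos(theta) + y*sin(theta)
    and contributes a parabolic vote 1 - (d/w)^2 to accumulator cells whose
    rho value lies within distance w of that sinusoid.
    """
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-rho_max, rho_max, n_rho)
    acc = np.zeros((n_theta, n_rho))
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)    # shape (n_theta,)
        d = np.abs(rhos[None, :] - r[:, None])         # shape (n_theta, n_rho)
        # Parabolic kernel with compact support [0, w]; clipped below at 0.
        acc += np.clip(1.0 - (d / w) ** 2, 0.0, 1.0)
    return thetas, rhos, acc

# Collinear test points on the line x*cos(0.5) + y*sin(0.5) = 0.7.
theta0, rho0 = 0.5, 0.7
ts = np.linspace(-1.0, 1.0, 20)
pts = [(rho0 * np.cos(theta0) - t * np.sin(theta0),
        rho0 * np.sin(theta0) + t * np.cos(theta0)) for t in ts]

thetas, rhos, acc = hough_transform(pts)
i, j = np.unravel_index(np.argmax(acc), acc.shape)
print(thetas[i], rhos[j])  # maximal grid point, near (theta0, rho0)
```

The paper's question is visible here: the maximal grid point is only near (θ₀, ρ₀), and whether the true global maximum is guaranteed to lie in its immediate neighborhood depends on how the quantization intervals relate to the variability of the Hough function.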

Original language: English
Pages (from-to): 119-134
Number of pages: 16
Journal: Computer Vision and Image Understanding
Issue number: 2
State: Published - Feb 1998
Externally published: Yes


Keywords

  • Computer vision
  • Covering methods
  • Global optimization
  • Hough transform
  • M-estimation
  • Pattern recognition
  • Robust regression

