TY - JOUR
T1 - Regularization in regression with bounded noise
T2 - A Chebyshev center approach
AU - Beck, Amir
AU - Eldar, Yonina C.
PY - 2007
Y1 - 2007
AB - We consider the problem of estimating a vector z in the regression model b = Az+w, where w is an unknown but bounded noise. As in many regularization schemes, we assume that an upper bound on the norm of z is available. To estimate z we propose a relaxation of the Chebyshev center, which is the vector that minimizes the worst-case estimation error over all feasible vectors z. Relying on recent results regarding strong duality of nonconvex quadratic optimization problems with two quadratic constraints, we prove that in the complex domain our approach leads to the exact Chebyshev center. In the real domain, this strategy results in a "pretty good" approximation of the true Chebyshev center. As we show, our estimate can be viewed as a Tikhonov regularization with a special choice of parameter that can be found efficiently by solving a convex optimization problem with two variables or a semidefinite program with three variables, regardless of the problem size. When the norm constraint on z is a Euclidean one, the problem reduces to a single-variable convex minimization problem. We then demonstrate via numerical examples that our estimator can outperform other conventional methods, such as least-squares and regularized least-squares, with respect to the estimation error. Finally, we extend our methodology to other feasible parameter sets, showing that the total least-squares (TLS) and regularized TLS can be obtained as special cases of our general approach.
KW - Bounded error estimation
KW - Chebyshev center
KW - Strong duality
KW - Nonconvex quadratic optimization
UR - http://www.scopus.com/inward/record.url?scp=41849124828&partnerID=8YFLogxK
U2 - 10.1137/060656784
DO - 10.1137/060656784
M3 - Article
AN - SCOPUS:41849124828
SN - 0895-4798
VL - 29
SP - 606
EP - 625
JO - SIAM Journal on Matrix Analysis and Applications
JF - SIAM Journal on Matrix Analysis and Applications
IS - 2
ER -