TY - JOUR
T1 - Linear regression with Gaussian model uncertainty
T2 - Algorithms and bounds
AU - Wiesel, Ami
AU - Eldar, Yonina C.
AU - Yeredor, Arie
N1 - Funding Information:
Manuscript received March 8, 2007; revised November 1, 2007. The associate editor coordinating the review of this manuscript and approving it for publication was Dr. Antonia Papandreou-Suppappola. The work of A. Wiesel and Y. C. Eldar was supported by the European Union 6th framework program via the NEWCOM and NEWCOM++ networks of excellence and by the Israel Science Foundation. Some of the results in this paper were presented at the 2006 IEEE International Conference on Acoustics, Speech and Signal Processing.
PY - 2008/6
Y1 - 2008/6
AB - In this paper, we consider the problem of estimating an unknown deterministic parameter vector in a linear regression model with random Gaussian uncertainty in the mixing matrix. We prove that the maximum-likelihood (ML) estimator is a (de)regularized least squares estimator and develop three alternative approaches for finding the regularization parameter that maximizes the likelihood. We analyze the performance using the Cramér-Rao bound (CRB) on the mean squared error, and show that the degradation in performance due to the uncertainty is not as severe as may be expected. Next, we address the problem again assuming that the variances of the noise and of the elements in the model matrix are unknown and derive the associated CRB and ML estimator. We compare our methods to known results on linear regression in the errors-in-variables (EIV) model. We discuss the similarity between these two competing approaches, and provide a thorough comparison that sheds light on their theoretical and practical differences.
KW - Errors in variables (EIV)
KW - Linear models
KW - Maximum-likelihood (ML) estimation
KW - Random model matrix
KW - Total least squares
UR - http://www.scopus.com/inward/record.url?scp=44849089298&partnerID=8YFLogxK
U2 - 10.1109/TSP.2007.914323
DO - 10.1109/TSP.2007.914323
M3 - Article
AN - SCOPUS:44849089298
SN - 1053-587X
VL - 56
SP - 2194
EP - 2205
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 6
ER -