MAP model selection in Gaussian regression

Felix Abramovich, Vadim Grinshtein

Research output: Contribution to journal › Article › peer-review

Abstract

We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist view, the proposed procedure results in penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting model selector. We establish an oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings, for both “nearly-orthogonal” and “multicollinear” designs.
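The abstract describes a MAP rule that, frequentistically, amounts to minimizing the residual sum of squares plus a complexity penalty depending only on the model size. The following is a minimal illustrative sketch of that general recipe, not the paper's specific penalty: it exhaustively searches sub-models and picks the one minimizing RSS plus a user-supplied size penalty (here a BIC/RIC-style `2k log p` is used purely as an example).

```python
import numpy as np
from itertools import combinations

def penalized_ls_select(X, y, sigma2, pen):
    """Exhaustive penalized least squares over all sub-models.

    Returns the predictor index set M minimizing
        RSS(M) + sigma2 * pen(|M|),
    the generic form of a complexity-penalized model selector.
    Illustrative sketch only: in the paper the penalty is derived
    from a prior on the model size; `pen` here is user-supplied.
    Exhaustive search is exponential in p, so this is for small p.
    """
    n, p = X.shape
    # Start from the empty model (fit is identically zero).
    best, best_crit = (), float(y @ y) + sigma2 * pen(0)
    for k in range(1, p + 1):
        for M in combinations(range(p), k):
            XM = X[:, M]
            beta, *_ = np.linalg.lstsq(XM, y, rcond=None)
            rss = float(np.sum((y - XM @ beta) ** 2))
            crit = rss + sigma2 * pen(k)
            if crit < best_crit:
                best, best_crit = M, crit
    return best, best_crit

# Toy usage: sparse truth with support {0, 1} among p = 8 predictors.
rng = np.random.default_rng(0)
n, p = 50, 8
X = rng.standard_normal((n, p))
y = 5.0 * X[:, 0] + 5.0 * X[:, 1] + rng.standard_normal(n)
# Example penalty (RIC-like), NOT the prior-based penalty of the paper:
selected, crit = penalized_ls_select(X, y, sigma2=1.0,
                                     pen=lambda k: 2.0 * k * np.log(p))
print(selected)
```

With a strong signal, the selected model should at least contain the true support; which (if any) spurious predictors survive depends on the penalty's severity, which is exactly what the prior on the model size controls in the paper's framework.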

Original language: English
Pages (from-to): 932-949
Number of pages: 18
Journal: Electronic Journal of Statistics
Volume: 4
State: Published - 2010

Keywords

  • Adaptivity
  • Complexity penalty
  • Gaussian linear regression
  • Maximum a posteriori rule
  • Minimax estimation
  • Model selection
  • Oracle inequality
  • Sparsity

