Model selection in regression under structural constraints

Felix Abramovich*, Vadim Grinshtein

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The paper considers model selection in regression under additional structural constraints on admissible models, where the number of potential predictors might be even larger than the available sample size. We develop a Bayesian formalism that serves as a natural tool for generating a wide class of model selection criteria based on penalized least squares estimation with various complexity penalties associated with a prior on the model size. The resulting criteria are adaptive to structural constraints. We establish an upper bound for the quadratic risk of the resulting MAP estimator and a corresponding lower bound for the minimax risk over a set of admissible models of a given size. We then specify the class of priors (and, therefore, the class of complexity penalties) for which, under a "nearly-orthogonal" design, the MAP estimator is asymptotically at least nearly-minimax (up to a log-factor) simultaneously over the entire range of sparse and dense setups. Moreover, when the number of admissible models is "small" (e.g., ordered variable selection) or, on the contrary, for complete variable selection, the proposed estimator achieves the exact minimax rates.
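To make the setup concrete, here is a minimal sketch of a penalized least squares criterion of the kind the abstract describes; the generic form only, with a penalty shape given purely as an illustrative assumption, not the paper's exact expression:

$$\hat{M} = \arg\min_{M \in \mathcal{M}} \left\{ \|y - X_M \hat{\beta}_M\|_2^2 + \mathrm{Pen}(|M|) \right\}, \qquad \mathrm{Pen}(k) \asymp \sigma^2 \left( k + 2 \ln \frac{1}{\pi(k)} \right),$$

where $\mathcal{M}$ is the set of admissible models defined by the structural constraints, $\hat{\beta}_M$ is the least squares estimator within model $M$, and $\pi(k)$ is the prior on the model size. The displayed $\mathrm{Pen}(k)$ only illustrates how a prior on model size induces a complexity penalty; the MAP estimator is then $\hat{\beta}_{\hat{M}}$.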

Original language: English
Pages (from-to): 480-498
Number of pages: 19
Journal: Electronic Journal of Statistics
Volume: 7
Issue number: 1
DOIs
State: Published - 2013

Keywords

  • Adaptivity
  • Complexity penalty
  • Gaussian linear regression
  • Maximum a posteriori rule
  • Minimaxity
  • Model selection
  • Sparsity
  • Structural constraints
