Abstract
Generalized additive models are among the modern techniques of statistical learning and are applicable in many areas of prediction, e.g., financial mathematics, computational biology, medicine, chemistry, and environmental protection. In these models, the expectation of the response is linked to the predictors via a link function. They are fitted by the local scoring algorithm, which uses a scatterplot smoother as its building block, as proposed by Hastie and Tibshirani (1987). In this article, we first give a short introduction and review. Then we present a mathematical model based on splines, built on a new clustering approach for the input data x, their density, and the variation of the output y. We contribute to regression with generalized additive models by bounding (penalizing) the second-order terms (curvature) of the splines, leading to a more robust approximation. Previously, in [23], we proposed and investigated a refining modification of the backfitting algorithm applied to additive models. Because of the drawbacks of the modified backfitting algorithm, we then solve the problem with continuous optimization techniques, which we expect to become an important complementary technology and alternative to the modified backfitting algorithm. In particular, we model and treat the constrained residual sum of squares within the elegant framework of conic quadratic programming.
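The fitting idea summarized above can be illustrated with a minimal sketch of the backfitting (Gauss-Seidel) cycle for an additive model y ≈ α + Σ_j f_j(x_j). This is not the authors' implementation: the kernel smoother, bandwidth, and data are illustrative stand-ins for the scatterplot smoother mentioned in the abstract.

```python
import numpy as np

def smooth(x, r, bandwidth=0.2):
    """Nadaraya-Watson kernel smoother: an illustrative stand-in for the
    scatterplot smoother used inside local scoring / backfitting."""
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ r) / w.sum(axis=1)

def backfit(X, y, n_iter=20):
    """Backfitting (Gauss-Seidel) for y ~ alpha + sum_j f_j(x_j):
    cycle over predictors, smoothing the partial residuals of each."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = smooth(X[:, j], partial)
            f[j] -= f[j].mean()  # center each f_j for identifiability
    return alpha, f

# Synthetic example (hypothetical data, not from the article)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
alpha, f = backfit(X, y)
fitted = alpha + f.sum(axis=0)
```

Each pass refits one component function against the residuals left by all the others; the centering step resolves the additive model's identifiability ambiguity. The drawbacks noted in the abstract (and the conic quadratic programming alternative) concern refinements beyond this basic cycle.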
Original language | English |
---|---|
Pages (from-to) | 675-698 |
Number of pages | 24 |
Journal | Optimization |
Volume | 56 |
Issue number | 5-6 |
DOIs | |
State | Published - Oct 2007 |
Externally published | Yes |
Keywords
- Backfitting (Gauss-Seidel) algorithm
- Classification
- Clustering
- Conic quadratic programming
- Continuous optimization
- Curvature
- Density
- Generalized additive model
- Penalty methods
- Regression
- Separation of variables
- Statistical learning
- Variation