Sparse and efficient estimation for partial spline models with increasing dimension

Guang Cheng, Hao Zhang, Zuofeng Shang

Research output: Contribution to journal › Article

5 Citations (Scopus)

Abstract

We consider model selection and estimation for partial spline models and propose a new regularization method in the context of smoothing splines. The regularization method has a simple yet elegant form, consisting of a roughness penalty on the nonparametric component and a shrinkage penalty on the parametric components, which achieves function smoothing and sparse estimation simultaneously. We establish the convergence rate and oracle properties of the estimator under weak regularity conditions. Remarkably, the estimated parametric components are sparse and efficient, and the nonparametric component can be estimated at the optimal rate. The procedure also has attractive computational properties. Using the representer theory of smoothing splines, we reformulate the objective function as a LASSO-type problem, enabling us to use the LARS algorithm to compute the solution path. We then extend the procedure to situations in which the number of predictors increases with the sample size and investigate its asymptotic properties in that context. Finite-sample performance is illustrated by simulations.
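
To make the double-penalty structure concrete, the display below sketches one plausible form of the objective for a partial spline model y_i = x_i'β + f(t_i) + ε_i: a roughness penalty on the nonparametric component f and a shrinkage penalty on the parametric coefficients β. The second-derivative roughness functional and the weights w_j are illustrative assumptions, not necessarily the exact formulation used in the paper.

```latex
% Partial spline model: y_i = x_i' beta + f(t_i) + e_i, with f in an RKHS.
% Illustrative double-penalized least-squares criterion (the weights w_j and
% the second-derivative roughness functional are assumptions for exposition):
\begin{equation*}
  \min_{\beta \in \mathbb{R}^{p},\; f \in \mathcal{H}}
  \ \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - x_i^{\top}\beta - f(t_i)\bigr)^{2}
  \ +\ \lambda_{1}\int_{0}^{1}\bigl(f''(t)\bigr)^{2}\,dt
  \ +\ \lambda_{2}\sum_{j=1}^{p} w_{j}\,\lvert \beta_{j}\rvert .
\end{equation*}
```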

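The abstract's computational remark, that the representer theorem reduces the problem to a LASSO-type criterion whose solution path can be traced with LARS, can be illustrated with a short profiling sketch: replace the response and design by their residuals after applying a linear smoother in t, then run LARS/LASSO on the parametric coefficients. The kernel smoother below stands in for the smoothing-spline hat matrix, and all variable names are hypothetical; this is a minimal sketch of the idea under those assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import lars_path

# Illustrative sketch only: profile out the nonparametric component with a
# linear smoother S (a simple Gaussian-kernel smoother standing in for the
# smoothing-spline hat matrix), then trace the LASSO path on the
# residualized design with LARS.

rng = np.random.default_rng(0)
n, p = 200, 10
t = np.sort(rng.uniform(0, 1, n))            # scalar covariate entering f(t)
X = rng.standard_normal((n, p))              # parametric predictors
beta_true = np.array([2.0, -1.5, 1.0] + [0.0] * (p - 3))
y = X @ beta_true + np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(n)

# Hypothetical stand-in for the smoother matrix S_lambda: a kernel smoother.
h = 0.05
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
S = K / K.sum(axis=1, keepdims=True)

# Profile out f: residualize response and design against the smoother in t.
y_tilde = y - S @ y
X_tilde = X - S @ X

# LARS traces the entire LASSO solution path for the parametric coefficients.
alphas, _, coefs = lars_path(X_tilde, y_tilde, method="lasso")
print("nonzero pattern at smallest alpha:", np.nonzero(coefs[:, -1])[0])
```

In this sketch the sparsity pattern recovered at the end of the path should pick out the first three predictors, mirroring the sparse-and-efficient behavior described in the abstract.
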
Original language: English (US)
Pages (from-to): 93-127
Number of pages: 35
Journal: Annals of the Institute of Statistical Mathematics
Volume: 67
Issue number: 1
DOIs: 10.1007/s10463-013-0440-y
State: Published - 2013

Keywords

  • High dimensionality
  • Oracle property
  • RKHS
  • Semiparametric models
  • Shrinkage methods
  • Smoothing splines
  • Solution path

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

Sparse and efficient estimation for partial spline models with increasing dimension. / Cheng, Guang; Zhang, Hao; Shang, Zuofeng.

In: Annals of the Institute of Statistical Mathematics, Vol. 67, No. 1, 2013, p. 93-127.

Research output: Contribution to journal › Article

@article{c92a805ccdae4c78bce4f415a2e357c1,
title = "Sparse and efficient estimation for partial spline models with increasing dimension",
abstract = "We consider model selection and estimation for partial spline models and propose a new regularization method in the context of smoothing splines. The regularization method has a simple yet elegant form, consisting of roughness penalty on the nonparametric component and shrinkage penalty on the parametric components, which can achieve function smoothing and sparse estimation simultaneously. We establish the convergence rate and oracle properties of the estimator under weak regularity conditions. Remarkably, the estimated parametric components are sparse and efficient, and the nonparametric component can be estimated with the optimal rate. The procedure also has attractive computational properties. Using the representer theory of smoothing splines, we reformulate the objective function as a LASSO-type problem, enabling us to use the LARS algorithm to compute the solution path. We then extend the procedure to situations when the number of predictors increases with the sample size and investigate its asymptotic properties in that context. Finite-sample performance is illustrated by simulations.",
keywords = "High dimensionality, Oracle property, RKHS, Semiparametric models, Shrinkage methods, Smoothing splines, Solution path",
author = "Guang Cheng and Hao Zhang and Zuofeng Shang",
year = "2013",
doi = "10.1007/s10463-013-0440-y",
language = "English (US)",
volume = "67",
pages = "93--127",
journal = "Annals of the Institute of Statistical Mathematics",
issn = "0020-3157",
publisher = "Springer Netherlands",
number = "1",

}
