Group sparse additive models

Junming Yin, Xi Chen, Eric P. Xing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

15 Citations (Scopus)

Abstract

We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.
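The abstract's key ingredients (an ℓ1/ℓ2-type group penalty, a group-level thresholding condition, and block coordinate descent) can be illustrated in the parametric special case, the group lasso, where each block update has a closed form. This is a minimal sketch of that analogue, not the paper's nonparametric method: GroupSpAM replaces the closed-form update below with a smoother-based functional update in Hilbert space, and all function and variable names here are illustrative.

```python
import numpy as np

def group_soft_threshold(s, lam):
    # Group-level thresholding: the whole group is set exactly to zero
    # when its joint norm falls below the penalty level lam; otherwise
    # the group is shrunk toward zero as a unit.
    norm = np.linalg.norm(s)
    if norm <= lam:
        return np.zeros_like(s)
    return (1.0 - lam / norm) * s

def group_lasso_bcd(X, y, groups, lam, n_iter=100):
    """Block coordinate descent for the group lasso, the parametric
    analogue of GroupSpAM's update. `groups` is a list of column-index
    lists; each group's columns are assumed orthonormalized so that
    X_g^T X_g / n = I, which makes the block update closed-form."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for g in groups:
            # Partial residual with group g's current fit added back in.
            r = y - X @ beta + X[:, g] @ beta[g]
            # Unpenalized block solution under the orthonormality assumption.
            s = X[:, g].T @ r / n
            beta[g] = group_soft_threshold(s, lam)
    return beta
```

On synthetic data with one truly active group, the inactive group's coefficients come out exactly zero (not merely small), which is the group-level support recovery the abstract refers to. The sketch omits the per-group weight `sqrt(|g|)` that the standard group lasso penalty carries; adding it only rescales `lam` per group.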

Original language: English (US)
Title of host publication: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Pages: 871-878
Number of pages: 8
Volume: 1
State: Published - 2012
Externally published: Yes
Event: 29th International Conference on Machine Learning, ICML 2012 - Edinburgh, United Kingdom
Duration: Jun 26, 2012 – Jul 1, 2012

Other

Other: 29th International Conference on Machine Learning, ICML 2012
Country: United Kingdom
City: Edinburgh
Period: 6/26/12 – 7/1/12


ASJC Scopus subject areas

  • Human-Computer Interaction
  • Education

Cite this

Yin, J., Chen, X., & Xing, E. P. (2012). Group sparse additive models. In Proceedings of the 29th International Conference on Machine Learning, ICML 2012 (Vol. 1, pp. 871-878).

@inproceedings{37d4846fdfd04f7e8d57f5988fe74ca9,
title = "Group sparse additive models",
abstract = "We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.",
author = "Junming Yin and Xi Chen and Xing, {Eric P.}",
year = "2012",
language = "English (US)",
isbn = "9781450312851",
volume = "1",
pages = "871--878",
booktitle = "Proceedings of the 29th International Conference on Machine Learning, ICML 2012",

}

TY - GEN

T1 - Group sparse additive models

AU - Yin, Junming

AU - Chen, Xi

AU - Xing, Eric P.

PY - 2012

Y1 - 2012

N2 - We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.

AB - We consider the problem of sparse variable selection in nonparametric additive models, with the prior knowledge of the structure among the covariates to encourage those variables within a group to be selected jointly. Previous works either study the group sparsity in the parametric setting (e.g., group lasso), or address the problem in the nonparametric setting without exploiting the structural information (e.g., sparse additive models). In this paper, we present a new method, called group sparse additive models (GroupSpAM), which can handle group sparsity in additive models. We generalize the ℓ1/ℓ2 norm to Hilbert spaces as the sparsity-inducing penalty in GroupSpAM. Moreover, we derive a novel thresholding condition for identifying the functional sparsity at the group level, and propose an efficient block coordinate descent algorithm for constructing the estimate. We demonstrate by simulation that GroupSpAM substantially outperforms the competing methods in terms of support recovery and prediction accuracy in additive models, and also conduct a comparative experiment on a real breast cancer dataset.

UR - http://www.scopus.com/inward/record.url?scp=84867131829&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84867131829&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781450312851

VL - 1

SP - 871

EP - 878

BT - Proceedings of the 29th International Conference on Machine Learning, ICML 2012

ER -