Abstract
This article proposes the multiclass proximal support vector machine (MPSVM) classifier, which extends the binary PSVM to the multiclass case. Unlike the one-versus-rest approach, which constructs the decision rule from multiple binary classification tasks, the proposed method considers all classes simultaneously and enjoys better theoretical properties and empirical performance. We formulate the MPSVM as a regularization problem in the reproducing kernel Hilbert space and show that it implements the Bayes rule for classification. In addition, the MPSVM handles equal and unequal misclassification costs in a unified framework. We suggest an efficient algorithm that implements the MPSVM by solving a system of linear equations; this requires much less computational effort than the standard SVM, which is typically trained by quadratic programming and can be slow for large problems. We also provide an alternative, more robust algorithm for ill-posed problems. The effectiveness of the MPSVM is demonstrated by both simulation studies and applications to cancer classification using microarray data.
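To illustrate the "linear system instead of quadratic programming" idea the abstract refers to, below is a minimal sketch of a *binary* proximal SVM fit by solving a single linear system, in the spirit of the PSVM that the MPSVM extends. This is not the paper's multiclass formulation; the function names, variable names (`A`, `y`, `nu`), and toy data are assumptions for illustration only.

```python
# Minimal binary proximal SVM sketch: the fit reduces to one linear solve,
# (I/nu + E'E) u = E'D e, with E = [A, -e], u = (w, gamma), D = diag(y).
# This mirrors the binary PSVM objective
#   min_(w,gamma) (nu/2)||D(Aw - e*gamma) - e||^2 + (1/2)(||w||^2 + gamma^2).
import numpy as np

def psvm_fit(A, y, nu=1.0):
    """Return (w, gamma) for the proximal separating plane x'w = gamma."""
    m, n = A.shape
    e = np.ones(m)
    E = np.hstack([A, -e[:, None]])        # augmented data matrix [A, -e]
    rhs = E.T @ y                          # E' D e with D = diag(y)
    u = np.linalg.solve(np.eye(n + 1) / nu + E.T @ E, rhs)
    return u[:n], u[n]

def psvm_predict(A, w, gamma):
    return np.sign(A @ w - gamma)

# Toy usage on two Gaussian blobs labeled -1 / +1 (hypothetical data).
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
w, gamma = psvm_fit(A, y, nu=1.0)
print("training accuracy:", np.mean(psvm_predict(A, w, gamma) == y))
```

The key point of the sketch is that the coefficient matrix is a small, well-conditioned positive-definite system, so fitting amounts to one linear solve rather than an iterative quadratic program; the paper's MPSVM carries this property over to the multiclass, kernelized setting.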
Original language | English (US) |
---|---|
Pages (from-to) | 339-355 |
Number of pages | 17 |
Journal | Journal of Computational and Graphical Statistics |
Volume | 15 |
Issue number | 2 |
DOIs | |
State | Published - Jun 2006 |
Externally published | Yes |
Keywords
- Bayes rule
- Nonstandard classifications
- Reproducing kernel Hilbert space
ASJC Scopus subject areas
- Statistics and Probability
- Discrete Mathematics and Combinatorics
- Statistics, Probability and Uncertainty