### Abstract

We examine Support Vector Machines from the point of view of solutions to variational problems in a reproducing kernel Hilbert space. We discuss the Generalized Comparative Kullback-Leibler Distance (GCKL) as a target for choosing tuning parameters in SVMs, and we propose the Generalized Approximate Cross Validation (GACV) estimate of it as a reasonable proxy for this target. We indicate an interesting relationship between the GACV and the SVM margin.
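The variational view referred to above casts the SVM as the minimizer of a regularized hinge-loss objective, (1/n) Σᵢ (1 − yᵢ f(xᵢ))₊ + λ‖f‖²_HK, where by the representer theorem the solution takes the form f(x) = Σⱼ cⱼ K(x, xⱼ). As a minimal illustration only — this sketch does not reproduce the paper's GACV formula, and the kernel, step size, and toy data are all assumed choices — the following Python code fits that objective by subgradient descent on a one-dimensional example:

```python
import math

def rbf(u, v, gamma=1.0):
    # Gaussian (RBF) kernel on scalars; gamma is an illustrative choice
    return math.exp(-gamma * (u - v) ** 2)

def fit_svm(xs, ys, lam, steps=2000, lr=0.05):
    """Subgradient descent on the regularized hinge-loss objective
    (1/n) sum_i (1 - y_i f(x_i))_+  +  lam * c^T K c,
    with f(x) = sum_j c_j K(x, x_j) (representer-theorem form)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) for j in range(n)] for i in range(n)]
    c = [0.0] * n
    for _ in range(steps):
        f = [sum(K[i][j] * c[j] for j in range(n)) for i in range(n)]
        grad = [0.0] * n
        for i in range(n):
            if ys[i] * f[i] < 1.0:           # hinge subgradient is active
                for j in range(n):
                    grad[j] -= ys[i] * K[i][j] / n
        for j in range(n):                    # gradient of lam * c^T K c
            grad[j] += 2.0 * lam * sum(K[j][k] * c[k] for k in range(n))
        c = [c[j] - lr * grad[j] for j in range(n)]
    return c, K

def predict(xs_train, c, x):
    # Evaluate f(x) = sum_j c_j K(x, x_j)
    return sum(c[j] * rbf(x, xs_train[j]) for j in range(len(xs_train)))

if __name__ == "__main__":
    xs = [-2.0, -1.5, -1.0, 1.0, 1.5, 2.0]   # toy 1-D data, not from the paper
    ys = [-1, -1, -1, 1, 1, 1]
    c, _ = fit_svm(xs, ys, lam=0.01)
    print([1 if predict(xs, c, x) > 0 else -1 for x in xs])
```

In this framing, λ (and any kernel parameter such as gamma) are exactly the tuning parameters whose selection the GCKL targets and the GACV estimates; the sketch leaves them fixed rather than tuned.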

Original language | English (US)
---|---|
Pages | 12-20
Number of pages | 9
State | Published - Dec 1 1999
Externally published | Yes
Event | Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99) - Madison, WI, USA. Duration: Aug 23 1999 → Aug 25 1999

### Conference

Conference | Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99)
---|---|
City | Madison, WI, USA
Period | 8/23/99 → 8/25/99

### ASJC Scopus subject areas

- Signal Processing
- Software
- Electrical and Electronic Engineering

### Cite this

*Margin-like quantities and generalized approximate cross validation for support vector machines*. 12-20. Paper presented at Proceedings of the 1999 9th IEEE Workshop on Neural Networks for Signal Processing (NNSP'99), Madison, WI, USA.

**Margin-like quantities and generalized approximate cross validation for support vector machines.** / Wahba, Grace; Lin, Yi; Zhang, Hao.

Research output: Contribution to conference › Paper


TY - CONF

T1 - Margin-like quantities and generalized approximate cross validation for support vector machines

AU - Wahba, Grace

AU - Lin, Yi

AU - Zhang, Hao

PY - 1999/12/1

Y1 - 1999/12/1

N2 - We examine Support Vector Machines from the point of view of solutions to variational problems in a reproducing kernel Hilbert space. We discuss the Generalized Comparative Kullback-Leibler Distance as a target for choosing tuning parameters in SVM's, and we propose that the Generalized Approximate Cross Validation estimate of them is a reasonable proxy for this target. We indicate an interesting relationship between the GACV and the SVM margin.

AB - We examine Support Vector Machines from the point of view of solutions to variational problems in a reproducing kernel Hilbert space. We discuss the Generalized Comparative Kullback-Leibler Distance as a target for choosing tuning parameters in SVM's, and we propose that the Generalized Approximate Cross Validation estimate of them is a reasonable proxy for this target. We indicate an interesting relationship between the GACV and the SVM margin.

UR - http://www.scopus.com/inward/record.url?scp=0033350009&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033350009&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:0033350009

SP - 12

EP - 20

ER -