
Statistical Properties and Adaptive Tuning of Support Vector Machines

Machine Learning, Volume 48 (3) – Oct 4, 2004

References: 17
Publisher: Springer Journals
Copyright: © 2002 Kluwer Academic Publishers
Subject: Computer Science; Artificial Intelligence (incl. Robotics); Control, Robotics, Mechatronics; Computing Methodologies; Simulation and Modeling; Language Translation and Linguistics
ISSN: 0885-6125
eISSN: 1573-0565
DOI: 10.1023/A:1013951620650

Abstract

In this paper we consider the statistical aspects of support vector machines (SVMs) in the classification context and describe an approach to adaptively tuning the smoothing parameter(s) in the SVMs. The relation between the Bayes rule of classification and the SVMs is discussed, shedding light on why the SVMs work well. This relation also reveals that the misclassification rate of the SVMs is closely related to the generalized comparative Kullback-Leibler distance (GCKL) proposed in Wahba (1999; in Schölkopf, Burges, & Smola, Eds., Advances in Kernel Methods—Support Vector Learning. Cambridge, MA: MIT Press). The adaptive tuning is based on the generalized approximate cross validation (GACV), which is an easily computable proxy for the GCKL. The results are generalized to the unbalanced case, where the fraction of members of the classes in the training set differs from that in the general population and the costs of misclassification for the two kinds of errors are different. The main results in this paper have been obtained in several places elsewhere. Here we take the opportunity to organize them in one place and note how they fit together and reinforce one another. Mostly the work of the authors is reviewed.
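For readers who want the formulation behind the abstract, the SVM it discusses is the penalized hinge-loss problem shown below, written in one common notation (the symbols f, h, H_K, and λ are conventional choices, not taken verbatim from the paper): labels y_i ∈ {−1, +1}, f = b + h with h in a reproducing kernel Hilbert space H_K, and smoothing parameter λ. The second display is the standard result behind the Bayes-rule connection the abstract mentions: the population minimizer of the hinge risk is exactly the Bayes rule sign(p(x) − 1/2).

```latex
% Penalized hinge-loss form of the SVM; notation is a common
% convention, not copied from the paper.
\[
  \min_{f = b + h,\; h \in \mathcal{H}_K}\;
  \frac{1}{n}\sum_{i=1}^{n}\bigl(1 - y_i f(x_i)\bigr)_{+}
  \;+\; \lambda \lVert h \rVert_{\mathcal{H}_K}^{2},
  \qquad (u)_{+} = \max(u, 0).
\]
% Bayes-rule connection: with p(x) = P(Y = 1 \mid X = x), the
% population minimizer of the hinge risk is the Bayes classifier.
\[
  \arg\min_{f(x)}\;
  \mathbb{E}\bigl[(1 - Y f(X))_{+} \,\big|\, X = x\bigr]
  \;=\; \operatorname{sign}\bigl(p(x) - \tfrac{1}{2}\bigr).
\]
```

The GACV criterion itself is not implemented in common libraries, so the sketch below substitutes an ordinary cross-validated hinge-loss proxy for the same tuning task: pick the smoothing level (here through the C parameter of scikit-learn's SVC, with C playing the role of 1/(nλ)) that minimizes held-out hinge loss. The dataset, grid, and fold count are illustrative assumptions, and this is a stand-in for, not a reproduction of, the paper's GACV.

```python
# Minimal sketch: tune the SVM smoothing parameter by minimizing a
# held-out hinge-loss proxy, playing the role GACV plays for the GCKL.
# This is NOT the paper's GACV; it is ordinary K-fold cross validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import hinge_loss
from sklearn.model_selection import KFold
from sklearn.svm import SVC

# Toy data; the sizes here are arbitrary assumptions for the sketch.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

grid = np.logspace(-3, 3, 13)   # candidate C values, C ~ 1/(n * lambda)
cv = KFold(n_splits=5, shuffle=True, random_state=0)

mean_losses = []
for C in grid:
    fold_losses = []
    for tr, te in cv.split(X):
        clf = SVC(kernel="rbf", C=C).fit(X[tr], y[tr])
        # Hinge loss on held-out decision values: the empirical analogue
        # of the expected-hinge-loss (GCKL-style) target being tracked.
        fold_losses.append(hinge_loss(y[te], clf.decision_function(X[te])))
    mean_losses.append(np.mean(fold_losses))

best_C = grid[int(np.argmin(mean_losses))]
print(f"selected C = {best_C:g}")
```

Smaller C corresponds to heavier smoothing (larger λ), so scanning a log-spaced grid of C mirrors scanning the smoothing parameter in the paper's setup.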

Journal

Machine Learning, Springer Journals

Published: Oct 4, 2004
