
Random Projections for Linear Support Vector Machines

SAURABH PAUL, Rensselaer Polytechnic Institute
CHRISTOS BOUTSIDIS, Yahoo! Labs, New York, NY
MALIK MAGDON-ISMAIL and PETROS DRINEAS, Rensselaer Polytechnic Institute

Publisher: Association for Computing Machinery
Copyright: © 2014 by ACM Inc.
ISSN: 1556-4681
DOI: 10.1145/2641760

Abstract

Let X be a data matrix of rank ρ, whose rows represent n points in d-dimensional space. The linear support vector machine constructs a hyperplane separator that maximizes the 1-norm soft margin. We develop a new oblivious dimension reduction technique that is precomputed and can be applied to any input matrix X. We prove that, with high probability, the margin and minimum enclosing ball in the feature space are preserved to within ε-relative error, ensuring comparable generalization as in the original space in the case of classification. For regression, we show that the margin is preserved to ε-relative error with high probability. We present extensive experiments with real and synthetic data to support our theory.

Categories and Subject Descriptors: I.5.2 [Design Methodology]: Classifier Design and Evaluation; Feature Evaluation and Selection; G.1.6 [Optimization]: Quadratic Programming Models; G.1.0 [General]: Numerical Algorithms

General Terms: Algorithms, Experimentation, Theory

Additional Key Words and Phrases: Classification, dimensionality reduction, support vector machines

ACM Reference Format: Saurabh Paul, Christos Boutsidis, Malik Magdon-Ismail, and Petros Drineas. 2014. Random projections for linear support vector machines. ACM Transactions on Knowledge Discovery from Data (TKDD). DOI: 10.1145/2641760
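The pipeline the abstract describes (draw a data-independent projection matrix, reduce the dimension of X, then train the linear SVM in the reduced space) can be illustrated concretely. The sketch below is not the authors' implementation: it uses a dense Gaussian matrix as one standard instance of an oblivious random projection (the paper analyzes several constructions), scikit-learn's LinearSVC as a 1-norm soft-margin solver, and synthetic data; the target dimension k and all parameter values are hypothetical.

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic data: n points in d dimensions, labeled by a random hyperplane
# so the data set is linearly separable.
n, d, k = 500, 2000, 100              # k: reduced dimension (hypothetical choice)
w_true = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = np.sign(X @ w_true)

# Oblivious random projection: R is drawn independently of X, so it can be
# precomputed once and applied to any data matrix with d columns.
R = rng.standard_normal((d, k)) / np.sqrt(k)   # Gaussian projection, one valid choice
X_proj = X @ R                                  # reduced data, n x k

# Train 1-norm soft-margin linear SVMs in the original and reduced spaces.
svm_full = LinearSVC(C=1.0, loss="hinge", max_iter=20000).fit(X, y)
svm_proj = LinearSVC(C=1.0, loss="hinge", max_iter=20000).fit(X_proj, y)

# For the canonical separating hyperplane, the margin is 1 / ||w||_2;
# comparing the two values checks the relative-error margin preservation.
print("full-space margin :", 1.0 / np.linalg.norm(svm_full.coef_))
print("projected margin  :", 1.0 / np.linalg.norm(svm_proj.coef_))
print("projected accuracy:", svm_proj.score(X_proj, y))

Under the paper's guarantee, for k large enough (depending on the rank ρ of X and on ε), the projected margin should stay within a (1 ± ε) factor of the full-space margin with high probability; the two printed margins give an empirical check of that behavior on this toy data.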

Journal

ACM Transactions on Knowledge Discovery from Data (TKDD), Association for Computing Machinery

Published: Aug 1, 2014
