Exploiting Fisher and Fukunaga-Koontz Transforms in Chernoff Dimensionality Reduction

Jing Peng, Montclair State University; Guna Seetharaman, Air Force Research Lab; Wei Fan, Huawei Noah's Ark Lab; Aparna Varde, Montclair State University

Knowledge discovery from big data demands effective representation of the data. However, big data are often characterized by high dimensionality, which makes knowledge discovery more difficult. Many techniques for dimensionality reduction have been proposed, including the well-known Fisher's Linear Discriminant Analysis (LDA). However, the Fisher criterion is incapable of dealing with heteroscedasticity in the data. A technique based on the Chernoff criterion for linear dimensionality reduction has been proposed that is capable of exploiting heteroscedastic information in the data. While the Chernoff criterion has been shown to outperform the Fisher criterion, a clear understanding of its exact behavior is lacking. In this article, we show precisely what can be expected from the Chernoff criterion. In particular, we show that the Chernoff criterion exploits the Fisher and Fukunaga-Koontz transforms in computing its linear discriminants. Furthermore, we show that a recently proposed decomposition of the data space into four subspaces is incomplete. We provide arguments on how best to enrich the decomposition of the data space in order to account for heteroscedasticity.
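The Fukunaga-Koontz transform named in the abstract can be illustrated with a minimal NumPy sketch (synthetic two-class data, not from the article): after whitening the sum of the two class covariance matrices, both classes share the same eigenvectors, and their eigenvalues sum to one. In other words, a direction that carries the most variance for one class carries the least for the other, which is what makes the transform useful for separating heteroscedastic classes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes with equal means but different covariance structure
# (heteroscedastic), where the Fisher criterion alone is uninformative.
X1 = rng.normal(size=(500, 3)) @ np.diag([3.0, 1.0, 0.5])
X2 = rng.normal(size=(500, 3)) @ np.diag([0.5, 1.0, 3.0])

S1 = np.cov(X1, rowvar=False)
S2 = np.cov(X2, rowvar=False)

# Whiten the summed covariance: S1 + S2 = P D P^T, so W = P D^{-1/2}
# gives W^T (S1 + S2) W = I.
D, P = np.linalg.eigh(S1 + S2)
W = P @ np.diag(D ** -0.5)

# Eigendecompose the whitened covariance of class 1; class 2 then has
# the same eigenvectors with complementary eigenvalues (1 - lambda).
lam1, V = np.linalg.eigh(W.T @ S1 @ W)
lam2 = np.diag(V.T @ (W.T @ S2 @ W) @ V)

print(np.allclose(lam1 + lam2, 1.0))  # prints True
```

The complementary eigenvalue spectra are what the article connects to the Chernoff criterion: directions with eigenvalues far from 1/2 capture covariance differences between the classes that the Fisher criterion, which looks only at mean differences relative to pooled within-class scatter, cannot see.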

Publisher
Association for Computing Machinery
Copyright
Copyright © 2013 ACM, Inc.
ISSN
1556-4681
DOI
10.1145/2499907.2499911
Publisher site
See Article on Publisher Site


Journal

ACM Transactions on Knowledge Discovery from Data (TKDD), Association for Computing Machinery

Published: Jul 1, 2013
