Multilabel dimensionality reduction via dependence maximization


Publisher: Association for Computing Machinery
Copyright: © 2010 ACM Inc.
ISSN: 1556-4681
DOI: 10.1145/1839490.1839495

Abstract

Multilabel Dimensionality Reduction via Dependence Maximization
YIN ZHANG and ZHI-HUA ZHOU, Nanjing University, China

Multilabel learning deals with data associated with multiple labels simultaneously. Like other data mining and machine learning tasks, multilabel learning also suffers from the curse of dimensionality. Dimensionality reduction has been studied for many years; however, multilabel dimensionality reduction remains almost untouched. In this article, we propose a multilabel dimensionality reduction method, MDDM, with two kinds of projection strategies, attempting to project the original data into a lower-dimensional feature space that maximizes the dependence between the original feature description and the associated class labels. Based on the Hilbert-Schmidt Independence Criterion, we derive an eigen-decomposition problem which enables the dimensionality reduction process to be efficient. Experiments validate the performance of MDDM.

Categories and Subject Descriptors: H.2.8 [Database Management]: Data Mining; I.2.6 [Artificial Intelligence]: Learning
General Terms: Algorithms, Design, Experimentation
Additional Key Words and Phrases: Dimensionality reduction, multilabel learning
ACM Reference Format: Zhang, Y. and Zhou, Z.-H. 2010. Multilabel dimensionality reduction via dependence maximization. ACM Trans. Knowl. Discov. Data 4, 3, Article 14 (October 2010), 21 pages. DOI = 10.1145/1839490.1839495 http://doi.acm.org/10.1145/1839490.1839495

1. INTRODUCTION
In traditional supervised learning, each instance is associated with one label that …
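The abstract's recipe reduces to a single linear-algebra computation. As a minimal sketch (not the authors' reference implementation): assuming a linear kernel on the features and a label kernel L = Y Y^T, the empirical HSIC of the projected data, tr(P^T X^T H L H X P) with centering matrix H = I - (1/n) 1 1^T, is maximized under orthonormal P by the top eigenvectors of the symmetric matrix X^T H L H X. The function name and matrix shapes below are illustrative assumptions:

    import numpy as np

    def mddm_linear(X, Y, k):
        # X: (n, d) feature matrix; Y: (n, q) binary label matrix; k: target dimension.
        # Hypothetical helper illustrating HSIC-style dependence maximization.
        n = X.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n   # centering matrix H = I - (1/n) 1 1^T
        L = Y @ Y.T                           # linear kernel on the label vectors
        M = X.T @ H @ L @ H @ X               # symmetric d x d objective matrix
        w, V = np.linalg.eigh((M + M.T) / 2)  # eigh returns ascending eigenvalues
        P = V[:, np.argsort(w)[::-1][:k]]     # top-k eigenvectors span the projection
        return P                              # reduce the data with Z = X @ P

Solving one symmetric eigen-decomposition, rather than running an iterative optimization over the projection, is what makes the dimensionality reduction process efficient.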

Journal: ACM Transactions on Knowledge Discovery from Data (TKDD), Association for Computing Machinery

Published: Oct 1, 2010
