Many evolutionary computation (EC) methods have been applied to feature selection and perform well on most small-scale problems. However, as the dimensionality of a feature selection problem increases, the solution space grows exponentially. Moreover, datasets typically contain far more irrelevant features than relevant ones, which creates many local optima in this huge solution space. As a result, existing EC methods still tend to stagnate in local optima on large-scale feature selection problems. Furthermore, large-scale feature selection problems on different datasets may have different properties, so an EC method with only a single candidate solution generation strategy (CSGS) may perform poorly across them. In addition, finding a suitable EC method and corresponding parameter values for a given large-scale feature selection problem is time-consuming if the problem is to be solved effectively and efficiently. In this article, we propose a self-adaptive particle swarm optimization (SaPSO) algorithm for feature selection, particularly large-scale feature selection. First, an encoding scheme for the feature selection problem is employed in SaPSO. Second, three important issues related to self-adaptive algorithms are investigated. The SaPSO algorithm with a typical self-adaptive mechanism is then proposed. Experimental results on 12 datasets show that SaPSO obtains smaller solution sizes than its EC counterparts on all datasets, and that it outperforms both its non-EC and EC counterparts in classification accuracy on most training and test sets. Moreover, as dimensionality increases, the advantages of SaPSO become more pronounced. This indicates that SaPSO is well suited to feature selection problems, particularly large-scale ones.
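The core idea described above — maintaining a pool of candidate solution generation strategies (CSGSs) and self-adaptively favoring the ones that have recently produced improvements — can be sketched in a few dozen lines. The sketch below is illustrative only, not the authors' implementation: the binary encoding, the three example strategies, the toy fitness function (which stands in for classification accuracy on a real dataset), and all parameter values are assumptions chosen for demonstration.

```python
import random

random.seed(0)

# Toy problem: 20 features, of which only indices 0-4 are "relevant".
# fitness() rewards selecting relevant features and penalizes subset size,
# standing in for a classifier's accuracy on a real dataset.
N_FEATURES = 20
RELEVANT = set(range(5))

def fitness(mask):
    hits = sum(1 for i in RELEVANT if mask[i])
    return hits - 0.1 * sum(mask)

# Three example CSGSs over a binary feature mask.
def flip_strategy(mask, pbest, gbest, rate):
    # Random bit flips: pure exploration.
    return [b ^ (random.random() < rate) for b in mask]

def pbest_strategy(mask, pbest, gbest, rate):
    # Copy bits from the particle's personal best.
    return [pb if random.random() < rate else b for b, pb in zip(mask, pbest)]

def gbest_strategy(mask, pbest, gbest, rate):
    # Copy bits from the global best: exploitation.
    return [gb if random.random() < rate else b for b, gb in zip(mask, gbest)]

STRATEGIES = [flip_strategy, pbest_strategy, gbest_strategy]

def roulette(weights):
    # Pick an index with probability proportional to its weight.
    r = random.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc >= r:
            return i
    return len(weights) - 1

def sapso(n_particles=15, iters=60, rate=0.2):
    swarm = [[random.randint(0, 1) for _ in range(N_FEATURES)]
             for _ in range(n_particles)]
    pbests = [m[:] for m in swarm]
    gbest = max(swarm, key=fitness)[:]
    # Success counts drive the self-adaptive strategy selection:
    # a strategy that improves particles gets chosen more often.
    success = [1.0] * len(STRATEGIES)
    for _ in range(iters):
        for i, mask in enumerate(swarm):
            s_idx = roulette(success)
            cand = STRATEGIES[s_idx](mask, pbests[i], gbest, rate)
            if fitness(cand) > fitness(mask):
                success[s_idx] += 1
            swarm[i] = cand
            if fitness(cand) > fitness(pbests[i]):
                pbests[i] = cand[:]
            if fitness(cand) > fitness(gbest):
                gbest = cand[:]
    return gbest
```

On this toy problem the swarm typically converges to a mask that selects the relevant features while pruning most irrelevant ones; the success-count bookkeeping is the "self-adaptive" ingredient, letting the effective strategy mix shift over the run instead of being fixed in advance.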
ACM Transactions on Knowledge Discovery from Data (TKDD) – Association for Computing Machinery
Published: Sep 24, 2019
Keywords: Feature selection