Affiliation:
1. Department of Computer Science, Jaypee Institute of Information Technology, Noida, India
Abstract
Background:
Feature selection, sometimes also known as attribute subset selection, is a process in
which an optimal subset of features is selected with respect to the target data by reducing dimensionality and removing
irrelevant features. A dataset with n features has 2^n possible feature subsets, which makes the problem
difficult to solve with conventional attribute selection methods. In such cases, metaheuristic-based methods
generally outperform conventional methods.
Objective:
The main aim of this paper is to enhance classification accuracy while minimizing the number of
selected features and the error rate.
Methods:
To achieve this objective, a binary metaheuristic feature selection method, bGWOSA, based on
grey wolf optimization and simulated annealing is introduced. The proposed feature selection method
uses simulated annealing to balance the trade-off between exploration and exploitation. The performance
of the proposed binary feature selection method has been examined on ten feature selection
benchmark datasets taken from the UCI repository and compared with binary cuckoo search, binary particle
swarm optimization, binary grey wolf optimization, the binary bat algorithm, and the binary hybrid whale optimization
method.
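To make the described hybrid concrete, the following is a minimal Python sketch of a binary grey wolf optimizer with a simulated-annealing acceptance step for feature selection. It is not the authors' exact formulation: the sigmoid transfer function, the KNN cross-validation fitness, the error/feature-count weighting, and all parameter values (population size, iterations, initial temperature, cooling rate) are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

def fitness(mask, X, y, alpha=0.99):
    # Assumed objective: weighted sum of classification error and subset size.
    if mask.sum() == 0:
        return 1.0  # penalize empty feature subsets
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, mask == 1], y, cv=3).mean()
    return alpha * (1 - acc) + (1 - alpha) * mask.sum() / mask.size

def sigmoid(x):
    # Transfer function mapping a continuous position to a probability of 1.
    return 1.0 / (1.0 + np.exp(-10 * (x - 0.5)))

def bgwo_sa(X, y, n_wolves=8, n_iter=30, T0=0.1, cooling=0.95):
    n_feat = X.shape[1]
    wolves = rng.integers(0, 2, size=(n_wolves, n_feat))
    scores = np.array([fitness(w, X, y) for w in wolves])
    T = T0
    for t in range(n_iter):
        order = np.argsort(scores)              # lower fitness is better
        alpha_w, beta_w, delta_w = wolves[order[:3]]
        a = 2 - 2 * t / n_iter                  # exploration coefficient shrinks over time
        for i in range(n_wolves):
            new = np.empty(n_feat, dtype=int)
            for j in range(n_feat):
                pos = 0.0
                for leader in (alpha_w, beta_w, delta_w):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[j] - wolves[i, j])
                    pos += (leader[j] - A * D) / 3
                new[j] = 1 if rng.random() < sigmoid(pos) else 0
            new_score = fitness(new, X, y)
            # Simulated-annealing acceptance: worse moves pass with prob exp(-d/T),
            # which is one way to trade exploration against exploitation.
            d = new_score - scores[i]
            if d < 0 or rng.random() < np.exp(-d / T):
                wolves[i], scores[i] = new, new_score
        T *= cooling
    best = np.argmin(scores)
    return wolves[best], scores[best]

# Example run on a built-in dataset (stand-in for the UCI benchmarks).
X, y = load_breast_cancer(return_X_y=True)
mask, score = bgwo_sa(X, y)
print("selected features:", int(mask.sum()), "fitness:", round(score, 4))
```

In this sketch the annealing temperature only gates acceptance of new wolf positions; the paper's actual coupling of simulated annealing with grey wolf optimization may differ.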
Results:
The proposed feature selection method achieves the highest accuracy on most of the datasets compared
to state-of-the-art methods. Further, the experimental and statistical results validate the efficacy of the
proposed feature selection method.
Conclusion:
Classification accuracy can be enhanced by employing feature selection methods. Moreover,
performance can also be enhanced by tuning the control parameters of metaheuristic methods.
Publisher
Bentham Science Publishers Ltd.