Authors:
Allam Mohan, Malaiyappan Nandhini
Abstract
The performance of machine learning models relies mainly on the key features available in the training dataset. Feature selection is a significant task in pattern recognition: finding an important group of features in order to build classification models with a minimum number of features. Feature selection with optimization algorithms improves the prediction rate of classification models, but tuning the controlling parameters of these optimization algorithms is a challenging task. In this paper, we present a wrapper-based model called Feature Selection with Integrative Teaching Learning Based Optimization (FS-ITLBO), which uses multiple teachers to select the optimal set of features from the feature space. The goal of the proposed algorithm is to search the entire solution space without getting stuck in local optima. Moreover, the proposed method requires only the teacher count parameter, along with the population size and the number of iterations. Various classification models have been used to compute the fitness of instances in the population and to estimate the effectiveness of the proposed model. The robustness of the proposed algorithm has been assessed on the Wisconsin Diagnostic Breast Cancer (WDBC) and Parkinson's Disease datasets and compared with different wrapper-based feature selection techniques, including the genetic algorithm and Binary Teaching Learning Based Optimization (BTLBO). The outcomes confirm that the FS-ITLBO model produces the best accuracy with the optimal subset of features.
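The sketch below illustrates the general idea of wrapper-based feature selection with binary TLBO-style teacher and learner phases on the WDBC dataset. It is a minimal illustration only: the specific multi-teacher update rules of FS-ITLBO are not reproduced from the paper, the grouping of learners under several teachers is an assumed simplification, and the classifier, population size, iteration count, and teacher count are placeholder choices.

```python
# Minimal sketch of wrapper-based feature selection in the spirit of binary TLBO.
# Assumptions: KNN as the wrapped classifier, sigmoid binarization of positions,
# and a simple split of the population into groups, each led by its best member
# ("teacher"). This is NOT the authors' FS-ITLBO implementation.
import numpy as np
from sklearn.datasets import load_breast_cancer  # WDBC
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Wrapper fitness: cross-validated accuracy on the selected features."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def binarize(real_vec):
    """Map a real-valued position to a binary feature mask via a sigmoid transfer."""
    prob = 1.0 / (1.0 + np.exp(-real_vec))
    return (rng.random(real_vec.shape) < prob).astype(float)

pop_size, n_iter, n_teachers = 20, 30, 3   # assumed parameter values
pop = rng.standard_normal((pop_size, n_features))
masks = np.array([binarize(p) for p in pop])
fits = np.array([fitness(m) for m in masks])

for _ in range(n_iter):
    # Split learners into groups; each group is guided by its own teacher.
    order = np.argsort(-fits)
    for group in np.array_split(order, n_teachers):
        teacher, mean = pop[group[0]], pop[group].mean(axis=0)
        for i in group:
            # Teacher phase: move toward the group's teacher.
            tf = rng.integers(1, 3)  # teaching factor in {1, 2}
            new = pop[i] + rng.random(n_features) * (teacher - tf * mean)
            # Learner phase: interact with a random peer of the same group.
            j = rng.choice(group)
            if fits[j] > fits[i]:
                new += rng.random(n_features) * (pop[j] - pop[i])
            else:
                new += rng.random(n_features) * (pop[i] - pop[j])
            new_mask = binarize(new)
            new_fit = fitness(new_mask)
            if new_fit > fits[i]:  # greedy acceptance
                pop[i], masks[i], fits[i] = new, new_mask, new_fit

best = np.argmax(fits)
print("best accuracy:", fits[best], "features kept:", int(masks[best].sum()))
```

Running the sketch prints the best cross-validated accuracy found and the size of the corresponding feature subset; swapping in other classifiers for the fitness function mirrors the paper's use of various classification models.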
Cited by
6 articles.