Author:
Akinola Olatunji A., Ezugwu Absalom E., Oyelade Olaide N., Agushaka Jeffrey O.
Abstract
The dwarf mongoose optimization (DMO) algorithm, developed in 2022, is a metaheuristic that has been applied to continuous mechanical engineering design problems with a considerable balance between its exploration and exploitation phases. Still, the DMO is restricted in its exploitation phase, which somewhat hinders the algorithm's optimal performance. In this paper, we propose a new hybrid method, BDMSAO, which combines the binary variant of the DMO (BDMO) with the simulated annealing (SA) algorithm. In the modelling and implementation of the hybrid BDMSAO algorithm, the BDMO is employed as the global search method and SA as the local search component, enhancing the limited exploitative mechanism of the BDMO. The new hybrid algorithm was evaluated on eighteen (18) UCI machine learning datasets of low and medium dimensionality. The BDMSAO was also tested on three high-dimensional medical datasets to assess its robustness. The results show the efficacy of BDMSAO in solving challenging feature selection problems on datasets of varying dimensionality, and its superior performance over the ten other methods in the study. Specifically, BDMSAO achieved the highest attainable classification accuracy in 61.11% of cases overall, reaching 100% accuracy on 9 of the 18 datasets. It also yielded the maximum obtainable accuracy on the three high-dimensional datasets while remaining competitive in the number of features selected.
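Since only the abstract is available here, the sketch below illustrates the general pattern it describes rather than the authors' exact method: a simulated-annealing local search refining a binary feature mask produced by a global search such as the BDMO. The fitness function, cooling schedule, and all parameter values are illustrative assumptions; a real feature-selection objective would typically combine classifier error with the fraction of features selected.

```python
# Minimal sketch (assumptions throughout): SA refinement of a binary feature
# mask, standing in for the local-search stage that BDMSAO applies after the
# BDMO global search. Not the authors' formulation.
import numpy as np

rng = np.random.default_rng(42)

def fitness(mask: np.ndarray) -> float:
    """Toy objective, lower is better. A synthetic target mask replaces the
    classifier-error term used in real feature selection (assumption)."""
    target = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
    penalty = 0.01 * mask.sum() / mask.size      # favour fewer selected features
    return np.abs(mask - target).mean() + penalty

def sa_refine(mask, t0=1.0, t_min=1e-3, alpha=0.95, steps_per_temp=20):
    """Simulated-annealing refinement: flip one random bit per move and
    accept worse moves with probability exp(-delta / T)."""
    best = current = mask.copy()
    best_f = current_f = fitness(current)
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            neighbour = current.copy()
            i = rng.integers(neighbour.size)
            neighbour[i] ^= 1                    # flip one feature bit
            delta = fitness(neighbour) - current_f
            if delta < 0 or rng.random() < np.exp(-delta / t):
                current, current_f = neighbour, current_f + delta
                if current_f < best_f:
                    best, best_f = current.copy(), current_f
        t *= alpha                               # geometric cooling (assumption)
    return best, best_f

seed_mask = rng.integers(0, 2, size=10)          # stand-in for the BDMO's best solution
refined, score = sa_refine(seed_mask)
print("seed   :", seed_mask)
print("refined:", refined, "fitness:", round(score, 4))
```

The design point this illustrates is the division of labour the abstract names: the population-based global search supplies a promising binary solution, and SA's probabilistic acceptance of worse neighbours lets the local stage escape shallow optima that a pure hill-climber would get stuck in.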
Publisher
Springer Science and Business Media LLC
Cited by
40 articles.