Abstract
The latest advances in engineering, science, and technology have contributed to an enormous generation of datasets. These vast datasets contain irrelevant, redundant, and noisy features that adversely impact classification performance in data mining and machine learning (ML) techniques. Feature selection (FS) is a preprocessing stage that reduces data dimensionality by choosing the most prominent features while improving classification performance. Since the datasets produced are often high-dimensional, the search space becomes highly complex: for a dataset with n features, the maximal number of potential solutions is 2^n. As n becomes large, exhaustively evaluating every feature subset becomes computationally infeasible. Therefore, there is a need for effective FS techniques for large-scale classification problems. Many metaheuristic approaches have been utilized for FS to resolve the challenges of heuristic-based approaches. Recently, swarm-based algorithms have been proposed and shown to perform effectively on FS tasks. Therefore, I developed a Hybrid Mutated Tunicate Swarm Algorithm for FS and Global Optimization (HMTSA-FSGO) technique. The proposed HMTSA-FSGO model mainly aims to eradicate unwanted features and choose the relevant ones that most strongly influence the classifier results. In the HMTSA-FSGO model, the HMTSA is derived by integrating the standard TSA with two concepts: a dynamic s-best mutation operator for an optimal trade-off between exploration and exploitation, and a directional mutation rule for enhanced search-space exploration. The HMTSA-FSGO model also includes a bidirectional long short-term memory (BiLSTM) classifier to examine the impact of the FS process. The rat swarm optimizer (RSO) is used to choose the hyperparameters that boost the BiLSTM network performance. The HMTSA-FSGO technique is validated through a series of experiments. The experimental validation of the HMTSA-FSGO technique showed superior outcomes of 93.01%, 97.39%, 61.59%, 99.15%, and 67.81% over diverse datasets.
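To make the wrapper-style FS setting concrete, the sketch below shows the kind of fitness function a metaheuristic such as HMTSA would optimize over the 2^n binary feature masks: classification accuracy is rewarded and the fraction of retained features is penalized. This is a minimal illustration, not the paper's implementation; the weight alpha, the k-NN surrogate classifier (standing in for the RSO-tuned BiLSTM), and the function name fs_fitness are assumptions introduced here for clarity.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def fs_fitness(mask, X, y, alpha=0.99):
    """Wrapper-style FS fitness: trade off accuracy against subset size.

    mask  : binary vector of length n_features (1 = feature kept)
    alpha : weight on accuracy vs. feature-count penalty
            (illustrative value, not taken from the paper)
    """
    selected = np.flatnonzero(mask)
    if selected.size == 0:          # an empty subset is infeasible
        return 0.0
    # A k-NN classifier is used here as a cheap surrogate evaluator;
    # the paper itself evaluates subsets with an RSO-tuned BiLSTM.
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, selected], y, cv=5).mean()
    # Reward accuracy, penalize the fraction of features retained.
    return alpha * acc + (1 - alpha) * (1 - selected.size / mask.size)

# Usage: a metaheuristic (TSA, HMTSA, ...) searches the 2**n binary masks
# for the one that maximizes fs_fitness(mask, X, y).
```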
Publisher
American Institute of Mathematical Sciences (AIMS)