Mantis Search Algorithm Integrated with Opposition-Based Learning and Simulated Annealing for Feature Selection
Abstract
Feature selection (FS) plays a vital role in reducing the dimensionality of high-dimensional data, thereby enhancing classification accuracy and lowering computational cost. FS techniques aim to extract the most effective feature subset, enabling machine learning (ML) algorithms to better capture the patterns in the input data and improve their classification performance. Although several metaheuristic algorithms have recently been proposed for this problem, they still suffer from drawbacks such as getting stuck in local optima, slow convergence, and a lack of population diversity, which prevent them from reaching desirable solutions in acceptable time. This study therefore proposes a new feature selection approach, named OBMSASA, which integrates the recently published mantis search algorithm with opposition-based learning (OBL) and simulated annealing (SA) to strengthen its exploration and exploitation operators. The OBL method improves the exploration operator, enabling the algorithm to escape stagnation in local optima, while SA serves as a local search that further strengthens the exploitation operator, thereby improving convergence speed. The K-nearest neighbor algorithm is used to compute the accuracy of the selected features. The proposed algorithm is assessed on 21 common datasets and compared with several rival optimizers across several performance metrics, including convergence curves, average fitness, computational cost, number of selected features, and standard deviation, to demonstrate its effectiveness and efficiency. The source code is publicly accessible at https://drive.mathworks.com/OBMSASA.
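Two of the building blocks named in the abstract can be sketched compactly. The snippet below is a minimal illustrative sketch (not the authors' implementation, which is in MATLAB at the linked repository): it shows basic opposition-based learning, which mirrors a candidate solution across the centre of the search bounds, and a K-nearest-neighbor-based fitness for a binary feature mask. The weighting parameter `alpha` and the use of a leave-one-out 1-NN error are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def opposite_solution(x, lb, ub):
    """Basic opposition-based learning: reflect a candidate solution x
    across the centre of the bounds, x_opp = lb + ub - x."""
    return lb + ub - x

def knn_fitness(mask, X, y, alpha=0.99):
    """Hypothetical FS fitness: weighted sum of the leave-one-out 1-NN
    error rate and the fraction of selected features (alpha is an
    assumed trade-off weight; lower fitness is better)."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:           # empty subset: assign worst fitness
        return 1.0
    Xs = X[:, idx].astype(float)
    # pairwise squared distances; exclude each sample from its own neighbours
    d = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)
    err = np.mean(y[d.argmin(axis=1)] != y)
    return alpha * err + (1 - alpha) * idx.size / X.shape[1]
```

In a typical metaheuristic FS loop, each continuous mantis position would be thresholded into a binary mask, its opposite generated with `opposite_solution`, and the better of the two (by `knn_fitness`) kept in the population.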
This work is licensed under a Creative Commons Attribution 4.0 International License.