Binary JAYA Algorithm with Adaptive Mutation for Feature Selection
Abstract
In this paper, a recent metaheuristic called the JAYA algorithm is adapted for feature selection. Feature selection is a classical problem in machine learning and data mining concerned with identifying a subset of highly discriminative features from data that are otherwise irrelevant, noisy, redundant, and high-dimensional. The JAYA algorithm was originally proposed for continuous optimization. Owing to the binary nature of the feature selection problem, the JAYA algorithm is adapted using a sinusoidal (i.e., S-shaped) transfer function. Furthermore, a mutation operator controlled by an adaptive mutation rate (Rm) parameter is employed to preserve diversity during the search. The proposed binary JAYA algorithm with adaptive mutation is called the BJAM algorithm. The performance of the BJAM algorithm is evaluated on 22 real-world benchmark datasets that vary in the number of features and the number of instances. Four measures are used for performance analysis: classification accuracy, number of selected features, fitness values, and computational time. First, the binary JAYA (BJA) algorithm and the proposed BJAM algorithm are compared to show the effect of the mutation operator on convergence behavior. The results produced by the BJAM algorithm are then compared against those yielded by ten state-of-the-art methods. Notably, the proposed BJAM algorithm outperforms the comparative methods on 7 out of the 22 datasets in terms of classification accuracy. These results suggest that the proposed BJAM algorithm is an efficient and promising approach for problems in the feature selection domain.
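To make the two ingredients of the abstract concrete, the following is a minimal sketch of how a binary JAYA-style update with an S-shaped transfer function and an adaptive mutation rate might be organized. All function names, the linear Rm schedule, and the population sizes below are illustrative assumptions for exposition, not the authors' implementation.

import numpy as np

def s_shaped(v):
    # S-shaped (sigmoid) transfer function: maps a continuous JAYA step
    # to the probability of setting the corresponding bit to 1.
    return 1.0 / (1.0 + np.exp(-v))

def bjam_step(pop, best, worst, rm, rng):
    # One illustrative BJAM-style iteration over a 0/1 population.
    # pop   : (n_solutions, n_features) binary matrix
    # best  : best solution found so far (binary vector)
    # worst : worst solution in the current population (binary vector)
    # rm    : adaptive mutation rate in [0, 1]
    r1 = rng.random(pop.shape)
    r2 = rng.random(pop.shape)
    # Classical JAYA move: attract toward the best, move away from the worst.
    step = r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
    # Binarize the continuous step with the S-shaped transfer function.
    new_pop = (rng.random(pop.shape) < s_shaped(step)).astype(int)
    # Adaptive mutation: flip each bit with probability rm to keep diversity.
    flip = rng.random(pop.shape) < rm
    new_pop[flip] = 1 - new_pop[flip]
    return new_pop

# Usage example with an assumed linearly decreasing Rm schedule.
rng = np.random.default_rng(0)
pop = rng.integers(0, 2, size=(10, 30))
best, worst = pop[0].copy(), pop[-1].copy()
for t in range(100):
    rm = 0.1 * (1 - t / 100)   # hypothetical adaptive schedule
    pop = bjam_step(pop, best, worst, rm, rng)

In practice, best and worst would be re-identified each iteration from a wrapper fitness (e.g., classification accuracy penalized by subset size); that evaluation loop is omitted here for brevity.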