Dimensionality Reduction Using an Improved Whale Optimization Algorithm for Data Classification

Author(s)

Ah. E. Hegazy 1,*, M. A. Makhlouf 1, Gh. S. El-Tawel 1

1. Faculty of Computers & Informatics, Suez Canal University, Ismailia, 41511, Egypt

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2018.07.04

Received: 1 May 2018 / Revised: 26 May 2018 / Accepted: 18 Jun. 2018 / Published: 8 Jul. 2018

Index Terms

Feature Selection, Whale Optimization Algorithm, Bio-inspired Optimization, Classification

Abstract

The whale optimization algorithm (WOA) is a bio-inspired optimization technique introduced in 2016 that imitates the hunting behavior of humpback whales. In this paper, to enhance solution accuracy, reliability, and convergence speed, we introduce modifications to the basic WOA structure. First, a new control parameter, an inertia weight, is proposed to tune the influence of the current best solution, yielding an improved whale optimization algorithm (IWOA). Second, we evaluate IWOA with various transfer functions that convert continuous solutions to binary ones. The proposed algorithm, combined with the K-nearest neighbor classifier, serves as a wrapper feature selection method that identifies feature subsets which improve classification accuracy while limiting the number of selected features. The proposed algorithm was compared with binary versions of the basic whale optimization algorithm, particle swarm optimization, the genetic algorithm, the antlion optimizer, and the grey wolf optimizer on 27 common UCI datasets. The optimization results demonstrate that IWOA not only significantly enhances the basic whale optimization algorithm but also performs considerably better than the other algorithms.
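The three ingredients the abstract describes — an inertia-weighted WOA position update, a transfer function that binarizes continuous positions into feature masks, and a K-nearest neighbor wrapper that scores each mask — can be sketched as below. This is a minimal, hypothetical illustration: the exact update rule, the choice of sigmoid transfer function, and the use of 1-NN accuracy as the fitness signal are assumptions based on standard WOA and wrapper feature selection formulations, not details taken from the paper's full text.

```python
import numpy as np

def sigmoid_transfer(x):
    # S-shaped transfer function mapping a continuous position to [0, 1];
    # a feature is selected when a uniform random draw falls below this value.
    return 1.0 / (1.0 + np.exp(-x))

def iwoa_encircling_update(pos, best, a, w, rng):
    # Sketch of the WOA "encircling prey" step with an added inertia weight w
    # scaling the current best solution (the IWOA idea described above).
    # a decreases linearly from 2 to 0 over iterations; A and C are the
    # standard WOA coefficient vectors.
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    A = 2.0 * a * r1 - a
    C = 2.0 * r2
    D = np.abs(C * w * best - pos)   # distance to the weighted best whale
    return w * best - A * D          # move toward the weighted best

def binarize(pos, rng):
    # Convert a continuous whale position into a binary feature mask.
    return (rng.random(pos.shape) < sigmoid_transfer(pos)).astype(int)

def knn_wrapper_accuracy(Xtr, ytr, Xte, yte, mask):
    # 1-NN classification accuracy using only the features where mask == 1.
    # A full fitness function would also penalize the subset size, e.g.
    # fitness = alpha * error + (1 - alpha) * |subset| / |all features|.
    if mask.sum() == 0:
        return 0.0
    A, B = Xtr[:, mask == 1], Xte[:, mask == 1]
    d = ((B[:, None, :] - A[None, :, :]) ** 2).sum(axis=-1)
    pred = ytr[d.argmin(axis=1)]
    return float((pred == yte).mean())
```

In a full run, each whale's continuous position would be updated with `iwoa_encircling_update`, binarized with `binarize`, and scored with the KNN wrapper; the best-scoring mask found over all iterations is the selected feature subset.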

Cite This Paper

Ah. E. Hegazy, M. A. Makhlouf, Gh. S. El-Tawel, "Dimensionality Reduction Using an Improved Whale Optimization Algorithm for Data Classification", International Journal of Modern Education and Computer Science (IJMECS), Vol.10, No.7, pp. 37-49, 2018. DOI: 10.5815/ijmecs.2018.07.04
