Exploring the Effect of Imaging Techniques Extension to PSO on Neural Networks

Full Text (PDF, 618KB), pp. 9-18


Author(s)

Anes A. Abbas 1,*, Nabil M. Hewahi 1

1. Department of Computer Science, University of Bahrain

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2020.02.02

Received: 30 Aug. 2019 / Revised: 16 Oct. 2019 / Accepted: 17 Nov. 2019 / Published: 8 Apr. 2020

Index Terms

Search Space Imaging, Metaheuristics, Optimization, Particle Swarm Optimization, Artificial Neural Networks, Population Initialization.

Abstract

In this paper we review some very recent search space imaging techniques inspired by space exploration, whose advantage is to guide the search over the space. To explore the effectiveness of these imaging techniques, we extend the Particle Swarm Optimization (PSO) algorithm with them and use the extended algorithm to train multiple neural networks on several classification datasets. The techniques are applied both during the population initialization stage and during the main search. Their performance has been measured through various experiments in which the techniques are evaluated against each other and against the standard PSO algorithm, taking into account classification accuracy and training runtime. The results show that the use of the imaging techniques produces better results.
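
For orientation, the sketch below shows the kind of setup the abstract describes: plain PSO training a small feed-forward classifier, with a pluggable initialization hook where an imaging-based population initialization could be substituted. This is a minimal illustration only, not the authors' implementation; the function names (pso_train, uniform_init, forward) and all parameter values are assumptions, and the imaging techniques themselves are not reproduced here.

```python
# Minimal sketch (assumed, not the paper's code): PSO trains a 1-hidden-layer
# classifier encoded as a flat weight vector; init_positions() is the hook
# where an imaging-based population initialization could replace uniform_init.
import numpy as np

def forward(weights, X, n_in, n_hidden, n_out):
    """Decode a flat weight vector into a one-hidden-layer network and run it."""
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden : n_in * n_hidden + n_hidden]
    off = n_in * n_hidden + n_hidden
    w2 = weights[off : off + n_hidden * n_out].reshape(n_hidden, n_out)
    b2 = weights[off + n_hidden * n_out :]
    h = np.tanh(X @ w1 + b1)
    return h @ w2 + b2

def fitness(weights, X, y, dims):
    """Classification error rate used as the particle fitness (lower is better)."""
    logits = forward(weights, X, *dims)
    return np.mean(logits.argmax(axis=1) != y)

def uniform_init(n_particles, dim, rng):
    """Baseline random initialization; an imaging-based scheme would replace this."""
    return rng.uniform(-1.0, 1.0, size=(n_particles, dim))

def pso_train(X, y, dims, init_positions=uniform_init,
              n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    n_in, n_hidden, n_out = dims
    dim = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
    pos = init_positions(n_particles, dim, rng)          # initialization stage
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p, X, y, dims) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):                               # main search
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p, X, y, dims) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()
```

Passing a different init_positions callable (or modifying the velocity/position update inside the loop) marks the two places the abstract says the imaging techniques are applied: population initialization and the main search.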

Cite This Paper

Anes A. Abbas, Nabil M. Hewahi, "Exploring the Effect of Imaging Techniques Extension to PSO on Neural Networks", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol.12, No.2, pp. 9-18, 2020. DOI: 10.5815/ijigsp.2020.02.02
