Workplace: Near East University, Lefkosa, via Mersin-10, North Cyprus
E-mail: kamil.dimililer@neu.edu.tr
Website:
Research Interests: Image Processing, Computer Architecture and Organization, Pattern Recognition, Computer Systems and Computational Processes
Biography
Kamil Dimililer received the B.Sc. and M.Sc. degrees in Electrical and Electronic Engineering from Near East University, North Cyprus. Shortly afterwards, he began his doctoral studies and received the Ph.D. degree in the same field in 2014. He briefly served as Deputy Chairman of the Department of Automotive Engineering and, since 2015, has been an Assistant Professor and Chairman of that department. Assist. Prof. Dr. Kamil Dimililer has 9 international journal publications, 16 conference publications, 3 chapters in international books, and 7 publications in national journals. His research interests include image processing, pattern recognition, and intelligent systems.
By Oyebade Kayode Oyedotun, Kamil Dimililer
DOI: https://doi.org/10.5815/ijigsp.2016.03.03, Pub. Date: 8 Mar. 2016
The ability of the human visual processing system to accommodate and retain a clear understanding or identification of patterns irrespective of their orientation is quite remarkable. In contrast, pattern invariance remains a common problem for intelligent recognition systems and cannot be overemphasized; clearly, the definition of an intelligent system broadens when the large variability with which the same patterns can occur is considered. This research investigates and reviews the performance of convolutional networks, and their variant, convolutional autoencoder networks, when tasked with recognition problems involving invariances such as translation, rotation, and scale. While various patterns could be used to validate this query, handwritten Yoruba vowel characters are used in this research. Databases of images containing patterns with the constraints of interest are collected, processed, and used to train and simulate the designed networks. We provide an extensive review of the architectures and learning paradigms of the considered networks, with a view to how built-in invariance is learned. Lastly, we provide a comparative analysis of the achieved error rates against back-propagation neural networks, denoising autoencoders, stacked denoising autoencoders, and deep belief networks.
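To illustrate the kind of convolutional autoencoder the abstract refers to, the following is a minimal sketch in Python with Keras, not the authors' implementation: the 32x32 grayscale input size, the layer widths, and the placeholder training array X_train are assumptions made purely for illustration.

```python
# Minimal convolutional autoencoder sketch (illustrative only, not the paper's code).
# Assumes 32x32 grayscale character images scaled to [0, 1]; layer sizes are arbitrary.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(32, 32, 1))

# Encoder: convolution + pooling yields feature maps with some built-in translation tolerance.
x = layers.Conv2D(16, (3, 3), activation="relu", padding="same")(inputs)
x = layers.MaxPooling2D((2, 2), padding="same")(x)
x = layers.Conv2D(8, (3, 3), activation="relu", padding="same")(x)
encoded = layers.MaxPooling2D((2, 2), padding="same")(x)

# Decoder: upsampling + convolution reconstructs the input pattern from the compressed code.
x = layers.Conv2D(8, (3, 3), activation="relu", padding="same")(encoded)
x = layers.UpSampling2D((2, 2))(x)
x = layers.Conv2D(16, (3, 3), activation="relu", padding="same")(x)
x = layers.UpSampling2D((2, 2))(x)
decoded = layers.Conv2D(1, (3, 3), activation="sigmoid", padding="same")(x)

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

# Unsupervised reconstruction training on character images that may include
# translated, rotated, and scaled variants; X_train here is random placeholder data.
X_train = np.random.rand(64, 32, 32, 1).astype("float32")
autoencoder.fit(X_train, X_train, epochs=1, batch_size=16, verbose=0)
```

In a recognition pipeline of the kind described, the trained encoder half would typically be reused as a feature extractor and topped with a classifier; the details of how the paper combines these stages are given in the full text.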