Armand F. Kana

Work place: Department of Computer Science, Ahmadu Bello University, Zaria, Nigeria

E-mail: donfackkana@gmail.com

Research Interests: Computer systems and computational processes, Computational Learning Theory, Data Structures and Algorithms, Formal Methods, Mathematics of Computing

Biography

Armand F. Donfack Kana received the B.Sc. degree in Computer Science from the University of Ilorin, Nigeria, and the M.Sc. and Ph.D. degrees in Computer Science from the University of Ibadan, Nigeria. He is currently a Senior Lecturer at the Department of Computer Science, Ahmadu Bello University, Zaria, Nigeria. His current research interests include Knowledge Representation and Reasoning, Machine Learning, Formal Ontologies and Soft Computing.

Author Articles
An Enhanced Adaptive k-Nearest Neighbor Classifier Using Simulated Annealing

By Anozie Onyezewe, Armand F. Kana, Fatimah B. Abdullahi, Aminu O. Abdulsalami

DOI: https://doi.org/10.5815/ijisa.2021.01.03, Pub. Date: 8 Feb. 2021

The k-Nearest Neighbor classifier is a simple and widely applied data classification algorithm that performs well in real-world applications. Its overall classification accuracy depends largely on the choice of the number of nearest neighbors (k). Using a constant k value does not always yield the best results, especially for real-world datasets with irregular class and density distributions of data points, because it ignores the class and density distribution of a test point's k-neighborhood. A remedy is to choose k dynamically for each test instance to be classified. However, for a large dataset, tuning k to maximize k-Nearest Neighbor performance becomes computationally demanding. This work proposes the use of Simulated Annealing, a metaheuristic search algorithm, to select an optimal k, eliminating the need for an exhaustive search over k. Results on four different classification tasks demonstrate a significant improvement in computational efficiency over k-Nearest Neighbor methods that search exhaustively for k: accurate nearest neighbors are returned faster, reducing the overall computation time.
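To illustrate the general idea, the following is a minimal sketch (not the authors' published code) of selecting k for a k-Nearest Neighbor classifier with simulated annealing rather than an exhaustive grid search. The dataset, scoring function (5-fold cross-validation), neighborhood move, and cooling schedule are illustrative assumptions, not the configuration used in the paper.

```python
# Hypothetical sketch: simulated annealing over the discrete space of k values
# for k-NN, instead of exhaustively evaluating every candidate k.
import math
import random

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def knn_accuracy(k, X, y):
    """Cross-validated accuracy of k-NN for a given k (the objective to maximize)."""
    clf = KNeighborsClassifier(n_neighbors=k)
    return cross_val_score(clf, X, y, cv=5).mean()


def anneal_k(X, y, k_max=50, t0=1.0, cooling=0.95, steps=100, seed=0):
    """Simulated annealing search for k in [1, k_max] (illustrative parameters)."""
    rng = random.Random(seed)
    k = rng.randint(1, k_max)
    score = knn_accuracy(k, X, y)
    best_k, best_score = k, score
    t = t0
    for _ in range(steps):
        # Propose a small random perturbation of the current k.
        k_new = min(k_max, max(1, k + rng.choice([-3, -2, -1, 1, 2, 3])))
        score_new = knn_accuracy(k_new, X, y)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if score_new >= score or rng.random() < math.exp((score_new - score) / t):
            k, score = k_new, score_new
            if score > best_score:
                best_k, best_score = k, score
        t *= cooling  # geometric cooling schedule
    return best_k, best_score


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    k, acc = anneal_k(X, y)
    print(f"selected k = {k}, cross-validated accuracy = {acc:.3f}")
```

Because only a fraction of the candidate k values are ever evaluated, the annealing search avoids the cost of an exhaustive sweep while the acceptance of occasional worse moves helps it escape local optima in the accuracy landscape.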

Other Articles