Elham Masoumi

Work place: Department of Computer Engineering, Qazvin Branch, Islamic Azad University, Qazvin, Iran

E-mail:

Website:

Research Interests: Computer systems and computational processes, Distributed Computing, Data Mining, Data Structures and Algorithms

Biography

Elham Masoumi: She received her M.Sc. in Computer Engineering from the Department of Computer Engineering and Information Technology, Qazvin Branch, Islamic Azad University, Iran. She currently lectures in computer science at the University of Applied Sciences. Her current research interests include distributed data mining and classifier combination.

Author Articles
Construction of High-accuracy Ensemble of Classifiers

By Hedieh Sajedi, Elham Masoumi

DOI: https://doi.org/10.5815/ijitcs.2014.05.01, Pub. Date: 8 Apr. 2014

Several methods have been developed to construct ensembles. Some of these methods, such as Bagging and Boosting, are meta-learners, i.e. they can be applied to any base classifier. The combination of methods should be selected so that the classifiers cover each other's weaknesses. In an ensemble, combining the outputs of several classifiers is useful only when they disagree on some inputs; the degree of disagreement is called the diversity of the ensemble. Another factor that plays a significant role in the performance of an ensemble is the accuracy of the base classifiers. It can be said that all procedures for constructing ensembles seek a balance between these two parameters, and successful methods achieve a better balance. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. In this paper, we present a new approach for generating ensembles. The proposed approach uses Bagging and Boosting as generators of base classifiers. Subsequently, the classifiers are partitioned by means of a clustering algorithm. We introduce a selection phase for constructing the final ensemble, and three different selection methods are proposed for this phase. The first proposed selection method selects a classifier randomly from each cluster. The second selects the most accurate classifier from each cluster, and the third selects the classifier nearest to the center of each cluster to construct the final ensemble. The results of experiments on well-known datasets demonstrate the strength of our proposed approach, especially when selecting the most accurate classifier from each cluster and employing the Bagging generator.
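The abstract describes a three-step pipeline: generate base classifiers with Bagging or Boosting, partition them with a clustering algorithm, and select one classifier from each cluster. The sketch below illustrates that idea under explicit assumptions: it uses scikit-learn, represents each classifier by its prediction vector on a held-out validation set, applies k-means for the clustering step, and implements only the second selection method (most accurate classifier per cluster). All function names and parameter values are illustrative, not the authors' exact procedure.

```python
# Illustrative sketch of the cluster-and-select ensemble idea from the abstract.
# Assumptions: scikit-learn >= 1.2 (uses the `estimator` keyword), integer class
# labels, and k-means over validation-set prediction vectors as the clustering step.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier


def build_pruned_ensemble(X_train, y_train, X_val, y_val,
                          n_estimators=50, n_clusters=10):
    # 1. Generate base classifiers with Bagging (Boosting could be used instead).
    bagging = BaggingClassifier(estimator=DecisionTreeClassifier(),
                                n_estimators=n_estimators, random_state=0)
    bagging.fit(X_train, y_train)
    members = bagging.estimators_

    # 2. Represent each classifier by its predictions on a validation set and
    #    cluster these vectors, so each cluster groups similarly behaving classifiers.
    preds = np.array([clf.predict(X_val) for clf in members])
    clusters = KMeans(n_clusters=n_clusters, random_state=0).fit_predict(preds)

    # 3. Selection phase: keep the most accurate classifier from each cluster
    #    (the second of the three proposed selection methods).
    selected = []
    for c in range(n_clusters):
        idx = np.where(clusters == c)[0]
        accs = [accuracy_score(y_val, preds[i]) for i in idx]
        selected.append(members[idx[int(np.argmax(accs))]])
    return selected


def majority_vote(ensemble, X):
    # Combine the selected classifiers by plain majority voting
    # (assumes non-negative integer class labels).
    votes = np.array([clf.predict(X) for clf in ensemble])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Swapping the selection rule for a random pick from each cluster, or for the classifier whose prediction vector lies closest to its cluster center, would correspond to the other two selection methods proposed in the paper.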
