Analysis of Parametric & Non Parametric Classifiers for Classification Technique using WEKA

Full Text (PDF, 436KB), pp. 43-49


Author(s)

Yugal Kumar 1,*, G. Sahoo 2

1. CSE/IT Dept, Hindu College of Engineering, Industrial Area, Sonepat, Haryana, India

2. Dept. of Information Technology, Birla Institute of Technology, Mesra, Ranchi, Jharkhand, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijitcs.2012.07.06

Received: 5 Sep. 2011 / Revised: 20 Jan. 2012 / Accepted: 11 Mar. 2012 / Published: 6 Jul. 2012

Index Terms

Bayesian Net, Logistic Regression, Multilayer Perceptron, Naive Bayes

Abstract

In the fields of machine learning and data mining, a great deal of work has been done on constructing new classification techniques/classifiers, and research continues into building further classifiers with the help of nature-inspired techniques such as Genetic Algorithms, Ant Colony Optimization, Bee Colony Optimization, Neural Networks, and Particle Swarm Optimization. Many researchers have provided comparative studies/analyses of classification techniques. This paper, however, deals with another form of analysis of classification techniques, i.e. the analysis of parametric and non-parametric classifiers. It identifies the parametric and non-parametric classifiers used in the classification process and provides a tree representation of these classifiers. For the analysis, four classifiers are used, two of which are parametric and the rest non-parametric in nature.
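The paper's experiments use the classifiers as implemented in WEKA. As a rough illustration of what "parametric" means in this context, the sketch below (not from the paper, plain Python rather than WEKA) implements a Gaussian Naive Bayes: the entire fitted model reduces to a fixed set of parameters per class (prior, per-feature mean and variance), whereas a non-parametric classifier's complexity grows with the training data.

```python
import math
from collections import defaultdict

def fit_gaussian_nb(X, y):
    """Estimate per-class prior, feature means and variances -- the
    fixed parameter set that makes Naive Bayes a parametric classifier."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    n = len(X)
    params = {}
    for c, rows in by_class.items():
        means = [sum(col) / len(rows) for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-9
                     for col, m in zip(zip(*rows), means)]
        params[c] = (len(rows) / n, means, variances)
    return params

def predict(params, x):
    """Return the class with the highest Gaussian log-posterior."""
    def log_post(c):
        prior, means, variances = params[c]
        ll = math.log(prior)
        for v, m, s2 in zip(x, means, variances):
            ll += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        return ll
    return max(params, key=log_post)

# Toy two-class data with two numeric features.
X = [(1.0, 1.1), (1.2, 0.9), (0.9, 1.0), (3.0, 3.2), (3.1, 2.9), (2.9, 3.0)]
y = ["a", "a", "a", "b", "b", "b"]
model = fit_gaussian_nb(X, y)
print(predict(model, (1.1, 1.0)))  # near the "a" cluster
print(predict(model, (3.0, 3.0)))  # near the "b" cluster
```

However many training instances are supplied, the model stays the same size: one (prior, means, variances) triple per class, which is the defining trait of a parametric classifier.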

Cite This Paper

Yugal Kumar, G. Sahoo, "Analysis of Parametric & Non Parametric Classifiers for Classification Technique using WEKA", International Journal of Information Technology and Computer Science (IJITCS), vol. 4, no. 7, pp. 43-49, 2012. DOI: 10.5815/ijitcs.2012.07.06

References

[1] Sarle, W. S. (1994), “Neural Networks and Statistical Models,” Proceedings of the Nineteenth Annual SAS Users Group International Conference, April, pp. 1–13.

[2] S. H. Musavi and M. Golabi (2008), “Application of Artificial Neural Networks in the River Water Quality Modeling: Karoon River, Iran”, Journal of Applied Sciences, Asian Network for Scientific Information, pp. 2324–2328.

[3] M. J. Diamantopoulou, V. Z. Antonopoulos and D. M. Papamichai (Jan. 2005), “The Use of a Neural Network Technique for the Prediction of Water Quality Parameters of Axios River in Northern Greece”, Journal of Operational Research, Springer-Verlag, pp. 115–125.

[4] Buntine, W. (1991). Theory refinement on Bayesian networks. In B. D. D’Ambrosio, P. Smets, & P. P. Bonissone (Eds.), Proceedings of the Seventh Annual Conference on Uncertainty in Artificial Intelligence (pp. 52–60). San Francisco, CA.

[5] Daniel Grossman and Pedro Domingos (2004). Learning Bayesian Network Classifiers by Maximizing Conditional Likelihood. In Proceedings of the 21st International Conference on Machine Learning, Banff, Canada.

[6] D. Marquardt (1963), “An Algorithm for Least-Squares Estimation of Nonlinear Parameters”, J. Soc. Ind. Appl. Math., vol. 11, pp. 431–441.

[7] L. Fausett (1994), “Fundamentals of Neural Networks: Architectures, Algorithms, and Applications”, Pearson Prentice Hall, USA.

[8] Ian H. Witten and Eibe Frank, Data Mining: Practical Machine Learning Tools and Techniques, Second Edition, Elsevier.

[9] Janikow, C. Z. (1998). “Fuzzy decision trees: issues and methods.” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 28(1): 1–14.

[10] Dutton, D. & Conroy, G. (1996), “A review of machine learning”, Knowledge Engineering Review 12: 341–367.

[11] De Mantaras, R. L. & Armengol, E. (1998), “Machine learning from examples: Inductive and lazy methods”, Data & Knowledge Engineering 25: 99–123.

[12] J. R. Quinlan (1986), “Induction of decision trees”, Machine Learning 1, pp. 81–106.

[13] J. R. Quinlan (1993), C4.5: Programs for Machine Learning, Morgan Kaufmann, San Francisco.

[14] M. S. Hung, M. Hu, M. Shanker (2001), “Estimating breast cancer risks using neural networks”, International Journal of Operational Research 52, 1–10.

[15] R. Setiono, L. C. K. Hui (1995), “Use of a quasi-Newton method in a feed-forward neural network construction algorithm”, IEEE Trans. Neural Networks 6(1), pp. 273–277.

[16] H. Ishibuchi, K. Nozaki, N. Yamamoto, H. Tanaka (1995), “Selecting fuzzy if-then rules for classification problems using genetic algorithms”, IEEE Trans. Fuzzy Systems 3(3), 260–270.

[17] D. B. Fogel, E. C. Wason, E. M. Boughton, V. W. Porto, P. J. Angeline (1998), “Linear and Neural Models for Classifying Breast Masses”, IEEE Trans. on Medical Imaging 17(3), 485–488.

[18] G. Fung, O. L. Mangasarian (Oct. 1999), “Semi-supervised support vector machines for unlabeled data classification”, Technical Report, Dept. of Computer Science, University of Wisconsin.

[19] C. H. Lee, D. G. Shin (1999), “A multi-strategy approach to classification learning in databases”, Data & Knowledge Engineering 31, 67–93.

[20] http://www.mcw.edu/FileLibrary/Groups/Biostatistics/Publicfiles/DataFromSection/DataFromSectionTXT/Data_from_section_1.14.txt

[21] Fawcett, T. (2006), “An introduction to ROC analysis”, Pattern Recognition Letters, Vol. 27, pp. 861–874.

[22] Melville, P.; Yang, S. M.; Saar-Tsechansky, M. & Mooney, R. (2005), “Active learning for probability estimation using Jensen-Shannon divergence”. In Proceedings of the European Conference on Machine Learning (ECML), pages 268–279. Springer.

[23] Landis, J. R. & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics 33: 159–174.