Analyzing Test Performance of BSIT Students and Question Quality: A Study on Item Difficulty Index and Item Discrimination Index for Test Question Improvement

Author(s)

Cris Norman P. Olipas 1,*, Ruth G. Luciano 1

1. College of Information and Communications Technology, Nueva Ecija University of Science and Technology, Cabanatuan City, Philippines

* Corresponding author.

DOI: https://doi.org/10.5815/ijitcs.2024.03.01

Received: 6 Jan. 2024 / Revised: 17 Mar. 2024 / Accepted: 12 Apr. 2024 / Published: 8 Jun. 2024

Index Terms

Item Difficulty Index, Item Discrimination Index, Learning Outcomes, Question Enhancement, Systems Integration and Architecture, Test Performance

Abstract

This study presents a comprehensive assessment of the test performance of Bachelor of Science in Information Technology (BSIT) students in the System Integration and Architecture (SIA) course, together with a careful examination of the quality of the test questions, in order to lay the groundwork for enhancing the assessment tool. Employing a cross-sectional research design, the study involved 200 fourth-year students enrolled in the course. The results revealed a significant discrepancy in scores between the upper and lower student cohorts, highlighting the need for targeted interventions, curriculum enhancements, and assessment refinements, particularly for the lower-performing group. Examination of the item difficulty index showed that certain items need fine-tuning to suit a broader spectrum of students, although the majority of items were adequately aligned with their intended difficulty levels. Analysis of the item discrimination index identified 25 items suitable for retention, 27 items warranting revision, and 3 items suitable for removal. These insights provide a valuable foundation for improving the assessment tool, thereby optimizing its capacity to evaluate students' acquired knowledge effectively. The study's novel contribution lies in integrating student performance assessment with an evaluation of assessment tool quality within the BSIT program, offering actionable insights for improving educational outcomes. By identifying the challenges BSIT students face and proposing targeted interventions, curriculum enhancements, and assessment refinements, the research advances our understanding of effective assessment practices. Furthermore, the detailed analysis of item difficulty and discrimination indices offers practical guidance for enhancing the reliability and validity of assessment tools in the BSIT program. Overall, this research contributes to the existing body of knowledge by providing empirical evidence and actionable recommendations tailored to the needs of BSIT students, promoting educational quality and student success in Information Technology education.
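
For concreteness, the two indices at the heart of the study can be computed from dichotomously scored responses. The sketch below is a minimal illustration in Python, not the authors' actual procedure: the function name item_indices, the 27% upper/lower grouping, the toy data, and the retain/revise/remove cutoffs (D >= 0.40 retain, 0.20 <= D < 0.40 revise, D < 0.20 remove) are common classical-test-theory conventions assumed here for illustration, not taken from the paper.

```python
# Minimal sketch of classical item analysis: difficulty index p and
# discrimination index D from upper/lower scoring groups. Grouping fraction
# and interpretive bands are common rules of thumb, assumed for illustration.
from typing import List, Tuple


def item_indices(responses: List[List[int]],
                 group_frac: float = 0.27) -> List[Tuple[int, float, float, str]]:
    """responses[s][i] is 1 if student s answered item i correctly, else 0."""
    n_students = len(responses)
    n_items = len(responses[0])
    # Rank students by total score, then take the top and bottom fractions.
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(group_frac * n_students))
    upper, lower = ranked[:k], ranked[-k:]

    results = []
    for i in range(n_items):
        p_upper = sum(s[i] for s in upper) / k
        p_lower = sum(s[i] for s in lower) / k
        # Difficulty index: proportion correct among all examinees
        # (some texts instead average the upper/lower proportions).
        p = sum(s[i] for s in responses) / n_students
        # Discrimination index: gap between upper and lower groups.
        d = p_upper - p_lower
        # Interpretive bands (assumed, not the study's cutoffs):
        if d >= 0.40:
            action = "retain"
        elif d >= 0.20:
            action = "revise"
        else:
            action = "remove"
        results.append((i + 1, round(p, 2), round(d, 2), action))
    return results


# Toy data: 6 students x 3 items, purely illustrative.
data = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]
for item, p, d, action in item_indices(data):
    print(f"item {item}: difficulty p={p}, discrimination D={d} -> {action}")
```

In this toy run, items 1 and 2 separate the upper and lower groups cleanly (D = 1.0, retain), while item 3 is answered correctly at the same rate in both groups (D = 0.0, remove), mirroring the retain/revise/remove classification the study applies to its 55 items.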

Cite This Paper

Cris Norman P. Olipas, Ruth G. Luciano, "Analyzing Test Performance of BSIT Students and Question Quality: A Study on Item Difficulty Index and Item Discrimination Index for Test Question Improvement", International Journal of Information Technology and Computer Science (IJITCS), Vol. 16, No. 3, pp. 1-11, 2024. DOI: 10.5815/ijitcs.2024.03.01
