A Unified Framework for Systematic Evaluation of ABET Student Outcomes and Program Educational Objectives


Author(s)

Imtiaz Hussain Khan 1,*

1. Department of Computer Science, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Saudi Arabia

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2019.11.01

Received: 19 Sep. 2019 / Revised: 29 Sep. 2019 / Accepted: 7 Oct. 2019 / Published: 8 Nov. 2019

Index Terms

Program Educational Objectives, Learning Outcomes, Outcome Assessment, Outcome Attainment, Assessment Tools, ABET Evaluation

Abstract

Assessment and evaluation of Program Educational Objectives (PEOs) and Student Outcomes (SOs) is a challenging task. In this paper, we present a unified framework, developed over a period of more than eight years, for the systematic assessment and evaluation of PEOs and SOs. The proposed framework is based on a balanced sampling approach that thoroughly covers PEO/SO assessment and evaluation while minimizing human effort. The framework is general; to demonstrate its effectiveness, we present a case study of its successful adoption by the undergraduate computer science program in the Department of Computer Science at King Abdulaziz University, Jeddah. The robustness of the proposed framework was ascertained by an independent evaluation by ABET, which awarded the program full six-year accreditation without any comments or concerns. The most significant value of the proposed framework is that it provides a balanced sampling mechanism for the assessment and evaluation of PEOs/SOs that can be adapted by any program seeking ABET accreditation.
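To make the idea of balanced sampling concrete, the sketch below shows one possible way such a mechanism could be organized in code: assessment records are grouped into (course, outcome) cells, a fixed number of work samples is drawn from every cell so that all outcomes receive comparable evaluation effort, and attainment is then aggregated per outcome. This is a minimal illustrative sketch only; the course codes, outcome labels, sample size, and 60% attainment threshold are assumptions for demonstration and are not taken from the paper, whose actual mechanism is described in the full text.

```python
import random
from collections import defaultdict

# Hypothetical records: each tuple is (course, student_outcome, score_percent).
# Course names, outcome labels, and scores are illustrative assumptions.
assessment_records = [
    ("CPCS-203", "SO-1", 82), ("CPCS-203", "SO-1", 55), ("CPCS-203", "SO-2", 71),
    ("CPCS-301", "SO-1", 64), ("CPCS-301", "SO-2", 90), ("CPCS-301", "SO-2", 48),
    ("CPCS-351", "SO-2", 77), ("CPCS-351", "SO-1", 69),
]

SAMPLE_SIZE = 2    # work samples drawn per (course, outcome) cell (assumed)
THRESHOLD = 60.0   # assumed minimum score for a sample to count as attained


def balanced_sample(records, k, seed=0):
    """Draw at most k samples from every (course, outcome) cell so that
    all cells are represented with comparable human effort."""
    cells = defaultdict(list)
    for course, so, score in records:
        cells[(course, so)].append(score)
    rng = random.Random(seed)
    return {cell: rng.sample(scores, min(k, len(scores)))
            for cell, scores in cells.items()}


def attainment_by_outcome(sampled):
    """Aggregate sampled scores per outcome and report the fraction of
    sampled work that meets the attainment threshold."""
    per_so = defaultdict(list)
    for (course, so), scores in sampled.items():
        per_so[so].extend(scores)
    return {so: sum(s >= THRESHOLD for s in scores) / len(scores)
            for so, scores in per_so.items()}


if __name__ == "__main__":
    sampled = balanced_sample(assessment_records, SAMPLE_SIZE)
    for so, rate in sorted(attainment_by_outcome(sampled).items()):
        print(f"{so}: {rate:.0%} of sampled work meets the threshold")
```

Capping the number of samples per cell is what keeps the effort bounded while still covering every course/outcome pairing; a real deployment would draw the records from course assessment data rather than a hard-coded list.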

Cite This Paper

Imtiaz Hussain Khan, "A Unified Framework for Systematic Evaluation of ABET Student Outcomes and Program Educational Objectives", International Journal of Modern Education and Computer Science (IJMECS), Vol. 11, No. 11, pp. 1-6, 2019. DOI: 10.5815/ijmecs.2019.11.01
