Design and Validity of an Instrument to Measure Digital Literacy among Pre-service Teachers involved in Inclusive Education

Full Text (PDF, 556 KB), pp. 84-96


Author(s)

Wu Miaomiao 1, Dorothy De Witt 1, Nor Nazrina Mohamad Nazry 1,*, Norlidah Alias 1, Lee Leh Hong 1, Alijah Ujang 2

1. Department of Curriculum and Instructional Technology, Faculty of Education, University of Malaya, Kuala Lumpur, 50603, Malaysia

2. Society of Community Rehabilitation Center, Selangor, 53100, Malaysia

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2024.01.07

Received: 6 Jul. 2023 / Revised: 19 Aug. 2023 / Accepted: 16 Oct. 2023 / Published: 8 Feb. 2024

Index Terms

Digital literacy, inclusive education, pre-service teachers, validity, reliability

Abstract

Assessing pre-service teachers’ digital literacy is challenging, particularly in inclusive education. A reliable and valid instrument is therefore required to measure the digital literacy that pre-service teachers bring to inclusive education. The research comprised three phases: the first developed the assessment instrument, the second established its content validity, and the third was a pilot study testing the instrument’s reliability and construct validity. The item-level and scale-level content validity indices were both 1.0. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.865, and five factors were extracted, explaining 54.40% of the total variance. The model fit indices were all satisfactory, and the standardized factor loadings of the instrument’s 28 items were above 0.5. Cronbach’s alpha values exceeded 0.7 for each of the five factors and for the instrument as a whole. It can be concluded that the instrument has good reliability and validity and can be used to assess the digital literacy of pre-service teachers in inclusive education. Although instruments for evaluating pre-service teachers’ digital literacy have been developed, few studies have addressed pre-service teachers in inclusive education, and this study fills that gap. The next phase is to evaluate the instrument with a larger sample.
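The validity and reliability statistics reported above are standard computations. As a minimal NumPy sketch — not the authors’ analysis code, which is not provided — the following illustrates how the content validity index (I-CVI and S-CVI/Ave) and Cronbach’s alpha could be computed; the function names and the common ≥3-on-a-4-point-scale relevance cutoff are assumptions based on standard CVI practice, not details taken from the paper.

```python
import numpy as np

def item_cvi(ratings: np.ndarray) -> np.ndarray:
    """Item-level content validity index (I-CVI): for each item, the
    proportion of experts rating it relevant (>= 3 on a 4-point scale)."""
    return (ratings >= 3).mean(axis=0)

def scale_cvi_ave(ratings: np.ndarray) -> float:
    """Scale-level CVI, averaging method (S-CVI/Ave): mean of the I-CVIs."""
    return float(item_cvi(ratings).mean())

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

When every expert rates every item 3 or 4, both CVI values equal 1.0, which is the pattern behind the scores reported in the abstract; an alpha above 0.7 per factor is the conventional acceptability threshold the study applies.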

Cite This Paper

Wu Miaomiao, Dorothy De Witt, Nor Nazrina Mohamad Nazry, Norlidah Alias, Lee Leh Hong, Alijah Ujang, "Design and Validity of an Instrument to Measure Digital Literacy among Pre-service Teachers involved in Inclusive Education", International Journal of Modern Education and Computer Science (IJMECS), Vol. 16, No. 1, pp. 84-96, 2024. DOI: 10.5815/ijmecs.2024.01.07
