Predicting Online Student Effort with Accelerometer, Heart Rate Sensors, and Camera Using Random Forest Regression Model

Full Text (PDF, 616KB), pp. 10-23


Author(s)

Fumiko Harada 1,*, Rin Nagai 2, Hiromitsu Shimakawa 3

1. Research Organization of Science and Technology, Ritsumeikan University, Japan

2. College of Information Science and Engineering, Ritsumeikan University, Japan

3. Graduate School of Information Science and Engineering, Ritsumeikan University, Japan

* Corresponding author.

DOI: https://doi.org/10.5815/ijmecs.2022.05.02

Received: 15 Jun. 2022 / Revised: 1 Jul. 2022 / Accepted: 11 Jul. 2022 / Published: 8 Oct. 2022

Index Terms

Online education, student effort, acceleration sensor, heart rate sensor, camera, machine learning, Random Forest regression model

Abstract

In online education conducted through web conferencing tools, teachers cannot grasp students' states by observing their behavior as they can in an offline classroom, and students cannot be influenced by their peers' good behavior. This paper proposes a method for predicting student effort from acceleration sensors and a heart rate sensor worn on the student's body, together with a local camera. Effort is expressed as levels of concentration, excitation, and bodily action, and a Random Forest regression model predicts each level from the sensor and camera data. Presenting the prediction results restores the visibility of student states that an offline classroom provides. We verified the effectiveness of the prediction model through an experiment, building Random Forest regression models from sensor, camera, and student-effort data collected in actual lectures. When one prediction model was built per lecture and per subject, the average R2 values were 0.953, 0.925, and 0.930 for concentration, excitation, and bodily action, respectively. However, R2 fell to -0.835 when a model trained on one lecture's data was applied to another lecture, and to 0.285 when a model trained on four subjects' data was applied to the remaining subject. This means the prediction model is highly accurate but depends on the individual person and lecture, which burdens each student with collecting initial training data for each lecture before a model can be built. We also found that the acceleration data are the most important features, which suggests the effectiveness of acceleration sensors for predicting student effort.
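The modeling pipeline described above (a Random Forest regressor fitted to windowed sensor features, evaluated by R2, with feature importances inspected) can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature names (`acc_x`, `heart_rate`, `face_motion`, etc.) and the synthetic data are placeholder assumptions, and the target is constructed so that acceleration dominates, mirroring the paper's feature-importance finding.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for windowed sensor features: three acceleration
# axes, heart rate, and one camera-derived feature (names are hypothetical).
feature_names = ["acc_x", "acc_y", "acc_z", "heart_rate", "face_motion"]
X = rng.normal(size=(300, len(feature_names)))
# Fabricated target (e.g. a concentration level) that depends mostly on
# acceleration, echoing the finding that acceleration features matter most.
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.3 * X[:, 3] + rng.normal(scale=0.1, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate on held-out data and rank features by importance.
r2 = r2_score(y_test, model.predict(X_test))
importances = dict(zip(feature_names, model.feature_importances_))
print(f"R^2 = {r2:.3f}")
print("most important feature:", max(importances, key=importances.get))
```

In the paper's per-lecture/per-subject setting, one such model would be trained separately for each of the three effort levels; the cross-lecture and cross-subject R2 drops reported above correspond to training and testing this pipeline on data from different lectures or different students.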

Cite This Paper

Fumiko Harada, Rin Nagai, Hiromitsu Shimakawa, "Predicting Online Student Effort with Accelerometer, Heart Rate Sensors, and Camera Using Random Forest Regression Model", International Journal of Modern Education and Computer Science (IJMECS), Vol. 14, No. 5, pp. 10-23, 2022. DOI: 10.5815/ijmecs.2022.05.02
