Predicting Students' Attention Level with Interpretable Facial and Head Dynamic Features in an Online Tutoring System (Student Abstract)

Authors

  • Shimeng Peng Nagoya University
  • Lujie Chen Carnegie Mellon University
  • Chufan Gao Carnegie Mellon University
  • Richard Jiarui Tong Squirrel AI Learning

DOI:

https://doi.org/10.1609/aaai.v34i10.7220

Abstract

Engaged learners are effective learners. Although it is widely recognized that engagement plays a vital role in learning effectiveness, engagement remains an elusive psychological construct that lacks a consensus definition and reliable measurement. In this study, we attempted to discover plausible operational definitions of engagement within an online learning context. We achieved this goal by first deriving a set of interpretable features on the dynamics of eye, head, and mouth movement from facial-landmark extractions of video recordings of students interacting with an online tutoring system. We then assessed their predictive value for engagement, which was approximated by synchronized measurements from a commercial EEG brainwave headset worn by students. Our preliminary results show that these features reduce root mean-squared error by 29% compared with a default predictor, and we found that a random forest model outperforms a linear regressor.
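The evaluation protocol described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' pipeline: the features and attention targets below are synthetic stand-ins for the facial/head dynamic features and the EEG-derived attention signal, and the models (random forest, linear regression, mean-value "default" predictor) are compared on root mean-squared error as in the abstract.

```python
# Hedged sketch: compare a random forest and a linear regressor against a
# default (mean) predictor using RMSE, mirroring the abstract's evaluation.
# All data here is synthetic; real features would come from facial-landmark
# dynamics and the targets from synchronized EEG attention readings.
import numpy as np
from sklearn.dummy import DummyRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 500, 6          # e.g. eye/head/mouth dynamic features
X = rng.normal(size=(n_samples, n_features))
# Synthetic attention target: a noisy function of the features
y = X @ rng.normal(size=n_features) + 0.5 * rng.normal(size=n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def rmse(model):
    """Fit on the training split and return test-set RMSE."""
    model.fit(X_tr, y_tr)
    return mean_squared_error(y_te, model.predict(X_te)) ** 0.5

baseline = rmse(DummyRegressor(strategy="mean"))   # "default predictor"
linear   = rmse(LinearRegression())
forest   = rmse(RandomForestRegressor(n_estimators=100, random_state=0))

print(f"default: {baseline:.3f}  linear: {linear:.3f}  forest: {forest:.3f}")
```

The "default predictor" baseline (predicting the training-set mean for every example) is what the abstract's 29% RMSE reduction is measured against; any informative feature set should beat it.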

Published

2020-04-03

How to Cite

Peng, S., Chen, L., Gao, C., & Tong, R. J. (2020). Predicting Students’ Attention Level with Interpretable Facial and Head Dynamic Features in an Online Tutoring System (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 34(10), 13895-13896. https://doi.org/10.1609/aaai.v34i10.7220

Section

Student Abstract Track