Term of Award
Master of Science, Computer Science (M.S.C.S.)
Document Type and Release Option
Thesis (open access)
Copyright Statement / License for Reuse
This work is licensed under a Creative Commons Attribution 4.0 License.
Department of Computer Science
Committee Member 1
Committee Member 2
Academic institutions and instructors lack the ability to accurately assess the moment-to-moment attentiveness of students in classrooms where students’ faces are obscured by computer monitors. As a result, lectures in Computer Science, Information Technology, and other lab-based courses can be incorrectly paced, leaving students with a poorer overall grasp of the subject material. We propose NiCATS, a system for accurate real-time detection of classroom attentiveness that uses monitor-mounted webcams and eye trackers together with a convolutional neural network (CNN) model. The neural network produces an initial attentiveness score from student webcam images; these scores are then compared against a set of extracted eye metrics to identify correlations for an automated attentiveness-judging system. Because perceived attentiveness alone does not indicate how well students understand the content being presented, this thesis explores the application of NiCATS to 1) understand student knowledge acquisition and 2) gain deeper insight into the content topics that students struggle with (e.g., muddiest points) in the context of gaze metrics. It is hoped that the results of this work will help instructors use NiCATS to understand student learning behavior in their classrooms. Interested researchers can also design interventions, evaluated through the automated collection and analysis of gaze metrics and face images within the NiCATS system, for improving student learning.
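The correlation step described above — comparing per-frame CNN attentiveness scores against extracted eye metrics — can be sketched as follows. This is a minimal illustration, not the thesis implementation: the array values, the choice of fixation duration as the gaze metric, and the use of Pearson correlation are all assumptions for demonstration purposes.

```python
import numpy as np

# Hypothetical per-frame data: CNN attentiveness scores (0-1) and one
# extracted gaze metric (fixation duration in ms) for the same frames.
# In the real system these would come from the webcam model and the
# eye tracker, respectively.
cnn_scores = np.array([0.91, 0.85, 0.40, 0.35, 0.78, 0.88, 0.30, 0.95])
fixation_ms = np.array([420.0, 390.0, 150.0, 130.0, 310.0, 400.0, 120.0, 450.0])

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    """Pearson correlation between two equal-length metric series."""
    x_c = x - x.mean()
    y_c = y - y.mean()
    return float((x_c @ y_c) / np.sqrt((x_c @ x_c) * (y_c @ y_c)))

r = pearson_r(cnn_scores, fixation_ms)
print(f"correlation between CNN score and fixation duration: r = {r:.2f}")
```

A strong positive correlation would suggest that the gaze metric tracks the model's attentiveness judgment, which is the kind of relationship an automated judging system could exploit.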
Boswell, Bradley, "Using AI-Enabled Gaze Tracking System to Understand Comprehension Patterns of Computer Science Students" (2022). Electronic Theses and Dissertations. 2441.
Research Data and Supplementary Material