Learner Attention Quantification using Eye Tracking and EEG Signals

Faculty Mentor

Dr. Andrew Allen, Dr. Felix Hamza-Lup

Location

Poster 218

Session Format

Poster Presentation

Academic Unit

Allen E. Paulson College of Engineering and Computing

Keywords

Allen E. Paulson College of Engineering and Computing Student Research Symposium, Electroencephalograms, EEG, Brain-Computer Interface, BCI

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Presentation Type and Release Option

Presentation (File Not Available for Download)

Start Date

January 2022



Technology-enabled tools that provide feedback on attention and focus can offer valuable insights into which pedagogical approaches work best for students [4,5]. This research focuses on how student attentiveness and knowledge acquisition can be measured using eye-tracking data (e.g., gaze points) and electroencephalogram (EEG) signals. Building on the existing NiCATS (Non-Intrusive Classroom Attention Tracking System) setup, we integrated a Brain-Computer Interface (BCI) device (the NeuroSky MindWave) with NiCATS. NiCATS aims to provide instructors with real-time feedback on the attentiveness of students in their classroom [1]. The NeuroSky BCI device records EEG power spectra (alpha, beta, gamma, etc.), which on-board algorithms interpret; NiCATS collects webcam images, eye gaze points, eye movements, and screenshots for quantifying student attentiveness. We merged the EEG signal data into the NiCATS pipeline and explored ways of quantifying the eye-tracking and EEG data, with the goal of identifying any relationship between EEG signals and attentiveness. Specifically, this research examines the correlation between eye metrics (Table 1) derived from the eye-gaze data and the EEG signals (the ThinkGear SDK's eSense and EEG band interpretations) generated by the NeuroSky MindSet device [2]. The results of this analysis will pave the way for exploring different measures of students' attentiveness and comprehension (e.g., facial images, eye movements, EEG signals) and for understanding how these inputs can be used to train a machine learning model that quantifies student attentiveness.
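The correlation analysis described above can be illustrated with a minimal sketch: align two sampled streams (an eye metric and the eSense attention value) on a shared time base, then compute their Pearson correlation. All names, sample values, and the one-second binning strategy here are illustrative assumptions, not the NiCATS pipeline or the ThinkGear SDK API.

```python
# Hypothetical sketch of the eye-metric / EEG correlation step.
# The stream names, sample data, and 1-second binning are assumptions
# for illustration; they do not reflect the actual NiCATS data format.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def align_by_second(samples):
    """Average (timestamp_seconds, value) samples into 1-second bins."""
    bins = {}
    for t, v in samples:
        bins.setdefault(int(t), []).append(v)
    return {sec: sum(vs) / len(vs) for sec, vs in bins.items()}

# Illustrative streams: (timestamp_seconds, value).
# Fixation durations (ms) stand in for one of the Table 1 eye metrics;
# the second stream mimics the 0-100 eSense "attention" meter.
fixation_duration = [(0.2, 180), (0.7, 220), (1.1, 300), (1.9, 280), (2.4, 150)]
esense_attention = [(0.5, 40), (1.5, 70), (2.5, 35)]

eye = align_by_second(fixation_duration)
eeg = align_by_second(esense_attention)
common = sorted(set(eye) & set(eeg))
r = pearson_r([eye[s] for s in common], [eeg[s] for s in common])
print(f"Pearson r over {len(common)} aligned seconds: {r:.2f}")
```

In practice the two devices sample at different rates, so some form of time alignment (binning, as above, or interpolation) is needed before any per-sample correlation is meaningful.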