Faculty Mentor

Dr. Rocio Alba-Flores

Location

Poster 107

Session Format

Poster Presentation

Academic Unit

Allen E. Paulson College of Engineering and Computing

Keywords

Allen E. Paulson College of Engineering and Computing Student Research Symposium, Electroencephalographic, EEG, Artificial Neural Network, ANN

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Presentation Type and Release Option

Presentation (Open Access)

Start Date

January 1, 2022, 12:00 AM

Drone Control Using Electroencephalogram Signals

In this project, we present the development of a system that controls a drone using a headset sensor that detects electroencephalographic (EEG) signals from the drone's pilot while they perform facial gestures. The drone is controlled with specific facial expressions, which are recorded using a commercial EEG headband, the OpenBCI EEG headband (fig. 2). The headband uses electrodes to read the electric potentials produced by the brain, and the EEG signals were recorded and analyzed using the OpenBCI GUI software.

The data files recorded from the EEG headband were exported to Matlab for signal conditioning, feature extraction, and the design and training of the Artificial Neural Network (ANN) used to classify the facial gestures. For each recording, three statistical values were computed: the standard deviation, the root mean square, and the mode. These values served as the features for each facial gesture and as the inputs to the ANN. The training input consisted of a 9x45 array generated from the pilot performing fifteen recordings of each facial gesture; the target matrix was 3x45, corresponding to 3 classes and 45 recordings. The Neural Net Pattern Recognition tool in Matlab was used to implement the ANN.

After the ANN was trained to classify the 3 facial gestures, its output was used to control the drone, a palm-sized DJI Tello. Three facial gestures were selected to control the motion of the drone: raising the eyebrows, blinking hard, and looking right. Training the ANN yielded 97% accuracy in the classification of the facial gestures.
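
As a concrete illustration of the feature-extraction step, the following Python sketch computes the three statistics named in the abstract (standard deviation, root mean square, and mode) for each EEG channel. The original work used Matlab, and the abstract does not state the channel count; the three-channel layout assumed here is only one way to obtain the 9-element feature vector (3 statistics x 3 channels) that the abstract describes.

    import numpy as np
    from scipy import stats

    def extract_features(recording):
        """Compute std, RMS, and mode per EEG channel.

        recording: array of shape (n_channels, n_samples); with an
        assumed 3 channels this yields the 9-element feature vector
        described in the abstract.
        """
        features = []
        for channel in recording:
            std = np.std(channel)
            rms = np.sqrt(np.mean(channel ** 2))
            # EEG samples are continuous-valued, so the signal is
            # quantized (rounded) before taking the mode.
            mode = stats.mode(np.round(channel, 2), keepdims=False).mode
            features.extend([std, rms, mode])
        return np.array(features)

    # 45 recordings (15 per gesture) -> the 9x45 input array
    # X = np.column_stack([extract_features(r) for r in recordings])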
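The classification stage could be sketched as follows. The original implementation used Matlab's Neural Net Pattern Recognition tool (patternnet); here a small scikit-learn MLP stands in for it, with synthetic placeholder data and an assumed hidden-layer size, since the abstract does not give the network architecture.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Placeholder data standing in for the real features: 45 recordings
    # (15 per gesture) with 9 features each, matching the abstract's
    # 9x45 array (transposed here, since scikit-learn expects samples
    # as rows). Replace with the actual extracted features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(45, 9))
    y = np.repeat([0, 1, 2], 15)  # 0=eyebrows, 1=hard blink, 2=look right

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=0)

    # Small MLP as a stand-in for Matlab's patternnet; the hidden-layer
    # size is an assumption.
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0)
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))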
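Finally, a minimal sketch of how the classifier output might drive the DJI Tello, using the community djitellopy Python library. The gesture-to-command mapping below is hypothetical; the abstract does not specify which motion each gesture produces.

    from djitellopy import Tello  # third-party DJI Tello wrapper

    # Hypothetical mapping from the ANN's predicted class to a Tello
    # command; these assignments are illustrative only.
    GESTURE_ACTIONS = {
        0: lambda t: t.move_up(30),           # raising eyebrows
        1: lambda t: t.move_forward(30),      # hard blinking
        2: lambda t: t.rotate_clockwise(90),  # looking right
    }

    def fly(gesture_stream):
        """Execute one Tello command per classified gesture."""
        tello = Tello()
        tello.connect()
        tello.takeoff()
        for gesture in gesture_stream:  # e.g. the ANN's argmax output
            GESTURE_ACTIONS[gesture](tello)
        tello.land()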