TapSense: Combining Self-Report Patterns and Typing Characteristics for Smartphone Based Emotion Detection
Document Type
Conference Proceeding
Publication Date
9-4-2017
Publication Title
Proceedings of the International Conference on Human-Computer Interaction with Mobile Devices and Services
DOI
10.1145/3098279.3098564
ISBN
978-1-4503-5075-4
Abstract
Typing-based communication applications on smartphones, like WhatsApp, can induce emotional exchanges. The effects of an emotion in one session of communication can persist across sessions. In this work, we attempt automatic emotion detection by jointly modeling typing characteristics and the persistence of emotion. Typing characteristics, such as speed, number of mistakes, and special characters used, are inferred from typing sessions. Self-reports recording emotion states after typing sessions capture the persistence of emotion. We use this data to train a personalized machine learning model for multi-state emotion classification. We implemented an Android-based smartphone application, called TapSense, that records typing-related metadata and uses a carefully designed Experience Sampling Method (ESM) to collect emotion self-reports. We are able to classify four emotion states (happy, sad, stressed, and relaxed) with an average accuracy (AUC-ROC) of 84% for a group of 22 participants who installed and used TapSense for 3 weeks.
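The personalized modeling summarized in the abstract (per-participant features drawn from typing sessions plus the previous self-report, fed to a multi-state classifier scored by AUC-ROC) could be sketched roughly as below. This is an illustrative sketch only, not the authors' implementation: the feature names, session dictionary keys, and the random-forest classifier are assumptions.

# Minimal sketch (assumptions noted above): a per-participant classifier over
# typing-session features, evaluated with one-vs-rest AUC-ROC averaged over
# the four emotion states reported in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

EMOTIONS = ["happy", "sad", "stressed", "relaxed"]

def session_features(session):
    # 'session' is assumed to be a dict with keys such as 'chars_typed',
    # 'duration_s', 'backspaces', 'special_chars', and 'prev_emotion'
    # (the previous ESM self-report, capturing persistence of emotion).
    speed = session["chars_typed"] / max(session["duration_s"], 1e-6)
    return [
        speed,                                     # typing speed
        session["backspaces"],                     # proxy for typing mistakes
        session["special_chars"],                  # special characters used
        EMOTIONS.index(session["prev_emotion"]),   # persistence of emotion
    ]

def train_personalized_model(sessions, labels):
    # Fit one model per participant on that participant's own typing sessions
    # and self-reported emotion labels.
    X = np.array([session_features(s) for s in sessions])
    y = np.array([EMOTIONS.index(label) for label in labels])
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, y_tr)
    # Macro-averaged one-vs-rest AUC-ROC across the four emotion states.
    auc = roc_auc_score(y_te, clf.predict_proba(X_te),
                        multi_class="ovr", average="macro")
    return clf, auc

Training one such model per participant, rather than a single global model, mirrors the personalized approach described in the abstract.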
Recommended Citation
Ghosh, Surjya, Niloy Ganguly, Bivas Mitra, and Pradipta De. 2017. "TapSense: Combining Self-Report Patterns and Typing Characteristics for Smartphone Based Emotion Detection." In Proceedings of the International Conference on Human-Computer Interaction with Mobile Devices and Services. Vienna, Austria: Association for Computing Machinery. doi: 10.1145/3098279.3098564. isbn: 978-1-4503-5075-4.
https://digitalcommons.georgiasouthern.edu/compsci-facpubs/90