TapSense: Combining Self-Report Patterns and Typing Characteristics for Smartphone based Emotion Detection
Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
Typing-based communication applications on smartphones, such as WhatsApp, can induce emotional exchanges. The effect of an emotion experienced in one communication session can persist across sessions. In this work, we attempt automatic emotion detection by jointly modeling typing characteristics and the persistence of emotion. Typing characteristics, such as speed, number of mistakes, and special characters used, are inferred from typing sessions. Self-reports recording emotion states after typing sessions capture the persistence of emotion. We use these data to train a personalized machine learning model for multi-state emotion classification. We implemented an Android smartphone application, called TapSense, that records typing-related metadata and uses a carefully designed Experience Sampling Method (ESM) to collect emotion self-reports. We are able to classify four emotion states - happy, sad, stressed, and relaxed - with an average accuracy (AUCROC) of 84% for a group of 22 participants who installed and used TapSense for 3 weeks.
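To make the pipeline concrete, the following is a minimal sketch, not the authors' code, of how per-session typing features could feed a personalized multi-state emotion classifier evaluated by AUCROC. The feature names, the synthetic data, and the random-forest choice are all assumptions for illustration; the abstract does not specify the model, and the sketch omits the self-report persistence modeling.

```python
# Sketch only: synthetic typing-session features and stand-in ESM labels,
# scored with macro-averaged one-vs-rest AUCROC over four emotion states.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
STATES = ["happy", "sad", "stressed", "relaxed"]

# Hypothetical per-session features: typing speed (chars/s),
# number of mistakes (backspaces), special-character ratio.
n = 400
X = np.column_stack([
    rng.normal(4.0, 1.0, n),    # typing speed
    rng.poisson(5, n),          # mistakes
    rng.uniform(0.0, 0.3, n),   # special-character ratio
])
y = rng.integers(0, len(STATES), n)  # stand-in for ESM self-report labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)

# Macro-averaged one-vs-rest AUCROC across the four emotion states.
auc = roc_auc_score(y_te, clf.predict_proba(X_te),
                    multi_class="ovr", average="macro")
print(f"AUCROC: {auc:.3f}")
```

With real labeled sessions per user, one such model would be trained per participant (the "personalized" setting); on the random data here the score hovers near chance.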
Ghosh, Surjya, Niloy Ganguly, Bivas Mitra, and Pradipta De. "TapSense: Combining Self-Report Patterns and Typing Characteristics for Smartphone based Emotion Detection." Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services. Vienna, Austria: ACM.
doi: 10.1145/3098279.3098564
isbn: 978-1-4503-5075-4
source: https://dl.acm.org/citation.cfm?id=3098564