Methodology for Studying Emotional, Conversational, and Linguistic Facial Expressions
With the inception of affective computing in the late 1990s, a new research field emerged to bring human affect recognition and generation into human-machine interaction. Although various types of affect have been studied using different modalities, the majority of work over the last 20 years has focused on recognizing the basic emotions defined by Paul Ekman. However, to achieve the goal of creating emotionally intelligent dialogue systems that can hold engaging conversations, the focus needs to move away from basic emotions and towards more fine-grained expressions. Because human communication is rich in different facial expressions, obtaining large datasets for any one specific expression is time-consuming and costly. In this work, I present a data bootstrapping methodology that combines automatic methods and human annotation to reduce the cost and time of creating annotated data. This methodology makes it possible to gain insights into emotional, conversational, and linguistic facial expressions, which I demonstrate in three domains: stress, enthusiasm, and adjectives in sign language. In all cases, facial action units from the Facial Action Coding System were automatically detected and statistically evaluated. In addition, the characteristic facial movements of each expression were quantitatively evaluated through user studies.
We hope that other researchers will follow up on exploring the expressions presented in this work, as well as ones not yet studied, using our data bootstrapping methodology, as such expressions can have a meaningful impact on pedagogical agents, sign language translating agents, and rapport-building with agents.
History
Date
- 2024-12-18
Degree Type
- Dissertation
Department
- Language Technologies Institute
Degree Name
- Doctor of Philosophy (PhD)