Multimodal Emotion Recognition among Couples from Lab Settings to Daily Life using Smartwatches
Open access
Author
Date
2022
Type
- Doctoral Thesis
ETH Bibliography
yes
Abstract
Couples generally manage chronic diseases together and the management takes
an emotional toll on both patients and their romantic partners. Consequently,
recognizing the emotions of each partner in daily life could provide an insight
into their emotional well-being in chronic disease management. The emotions of
partners are currently inferred in the lab and in daily life using self-reports, which
are not practical for continuous emotion assessment, or observer reports, which
are manual, time-intensive, and costly. Currently, there exists no comprehensive
overview of works on emotion recognition among couples. Furthermore,
approaches for emotion recognition among couples have (1) focused on English-speaking
couples in the U.S., (2) used data collected in the lab, and (3)
performed recognition using observer ratings rather than each partner's self-reported
/ subjective emotions.
In the body of work contained in this thesis (8 papers: 5 published and
3 currently under review in various journals), we fill the current literature gap
on couples’ emotion recognition, develop emotion recognition systems using 161
hours of data from a total of 1,051 individuals, and make contributions towards
taking couples’ emotion recognition from the lab which is the status quo, to
daily life. First, we provided a comprehensive survey of the research field of
emotion recognition among couples (Paper 1). Second, we leveraged insights
from psychology research and deep transfer learning approaches to develop machine
learning systems to recognize each partner's emotions using lab data from
Dutch-speaking couples in Belgium (Paper 2) and German-speaking couples in
Switzerland (Papers 3 and 4). We also performed emotion recognition using
data from German-speaking elderly individuals (not romantic partners) in Germany
(Paper 5), given that the target use case for our emotion recognition system
consisted of partners who were elderly and spoke German. Third, we developed
ubiquitous smartwatch and smartphone systems — VADLite and DyMand —
to collect relevant multimodal sensor data and self-report emotion data from the
daily life interactions of German-speaking, Swiss-based couples managing type 2
diabetes (Papers 6 and 7). Finally, we developed and evaluated machine learning
systems for recognizing each partner’s emotions using the collected multimodal
real-world smartwatch data — heart rate, accelerometer, gyroscope, and speech
(Paper 8).
This thesis contributes toward building automated emotion recognition systems
that would eventually enable partners to monitor their emotions in daily
life and enable the delivery of interventions to improve their emotional well-being.
Permanent link
https://doi.org/10.3929/ethz-b-000588706
Publication status
published
External links
Search print copy at ETH Library
Publisher
ETH Zurich
Subject
Emotion Recognition; Chronic Disease Management; Affective Computing; Wearable Computing; Mobile Sensing; Applied Machine Learning
Organisational unit
03681 - Fleisch, Elgar / Fleisch, Elgar