Comfortability Definition, Elicitation, Analysis and Recognition under a Human-Robot Interaction Perspective

LECHUGA REDONDO, MARIA ELENA
2022

Abstract

Among humans, interactions are not always guaranteed to be pleasant, despite humans being social agents. Sometimes we enjoy being involved, and sometimes we disapprove of the situation and wish to withdraw from it. Moreover, even though feelings are quite often masked, we humans are usually good at reading such situations. To some extent, we can intuit how comfortable our interaction partner is and discover whether we have acted appropriately by their standards. In the same fashion, we believe that the ability to perceive others’ “attitude towards the interaction” is what makes relationships successful, and thus the key to helping any interactive agent become part of society. For this reason, this thesis tackles this aspect from a Human-Robot Interaction (HRI) perspective. First, it introduces and explores the aforementioned internal state, naming it Comfortability. It provides a definition of Comfortability and tests whether it is comprehensible to others, meaningful for HRI, and distinct from other emotional and affective states. Second, it explores whether a humanoid robot is capable of affecting people’s Comfortability by staging a challenging real interview between the humanoid robot iCub and selected researchers from our institution. To study how people behave under such circumstances, the interviewees’ personality traits, attitudes toward robots, self-reports, and facial expressions have been analyzed in depth. In addition, to discover which non-verbal cues are associated with negative Comfortability levels, the videos recorded during the interviews have been trimmed into three-second clips and annotated with a Comfortability level between 1 (i.e., Extremely Uncomfortable) and 7 (i.e., Extremely Comfortable). As a result, a list of 30 facial and body movements has been identified as informative for expressing low Comfortability. Last, several algorithms based on Feature-based Learning have been developed. The one combining several facial and upper-body temporal dynamics has proven to be the best Comfortability classifier, obtaining a recognition accuracy of 76%. These findings pave the way for further projects contributing to artificial social intelligence, which might help future robots integrate better into genuine social contexts.
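As an illustration only, the following minimal sketch shows the kind of clip-level, Feature-based Learning pipeline the abstract describes: each three-second clip is summarized with simple temporal statistics and fed to a standard classifier. It is not the thesis's actual implementation; the channel count, the synthetic data, the binary low-vs-other reading of the 7-level scale, and the random-forest choice are all assumptions, and this toy example does not reproduce the reported 76% accuracy.

```python
# Minimal sketch (assumptions, not the thesis's code): per-clip temporal
# statistics of hypothetical facial/upper-body signals feed a scikit-learn classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def clip_features(frames: np.ndarray) -> np.ndarray:
    """Summarize one 3-second clip (frames x channels) with simple temporal
    statistics: per-channel mean, standard deviation, and mean absolute
    frame-to-frame change (a crude stand-in for 'temporal dynamics')."""
    return np.concatenate([
        frames.mean(axis=0),
        frames.std(axis=0),
        np.abs(np.diff(frames, axis=0)).mean(axis=0),
    ])

# Synthetic stand-in data: 200 clips of 75 frames (25 fps * 3 s) with 12
# channels (e.g., facial action-unit intensities plus upper-body joint angles).
X = np.stack([clip_features(rng.normal(size=(75, 12))) for _ in range(200)])
# Binary target, one plausible reading of the 7-level scale:
# low Comfortability (levels 1-3) vs. the rest. Labels here are random.
y = rng.integers(0, 2, size=200)

clf = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With random synthetic labels this sketch hovers around chance accuracy; it only conveys the shape of the approach, in which real facial action units and upper-body joint trajectories would replace the random channels.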
13 July 2022
English
REA, FRANCESCO
SCIUTTI, ALESSANDRA
Università degli studi di Genova
Files in this item:
phdunige_4619123.pdf (Adobe PDF, 5.49 MB), Open Access from 14/07/2023

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/170900
The NBN code of this thesis is URN:NBN:IT:UNIGE-170900