Group Dynamics in Human-Robot Interaction: Influence Patterns and Behavioral Responses
PUSCEDDU, GIULIA
2025
Abstract
Integrating social robots into everyday life requires a deep understanding of how these agents fit into group interactions. While previous studies have primarily focused on dyadic interactions, this thesis explores group dynamics in Human-Robot Interaction, focusing on the influence patterns that robots can exert within human-robot groups. By analyzing participants' behavioral responses to robotic actions, this work investigates both the positive and negative effects of robot influence, from fostering cooperation to the potential risk of social exclusion. First, the research examines the influence of social robots on team choices. Studies with children show that they rarely follow a robot's atypical behavior or advice that has proven to be incorrect. Instead, they tend to follow the lead of proactive group members, highlighting the importance of initiative as a leadership trait. These findings have implications for the design of robotic tutors, which could leverage the presence of proactive players to engage the whole group. Additionally, experiments with adults reveal that, although participants do not easily conform to robot-majority groups, increased response times may indicate hesitation when faced with robotic disagreement, suggesting a form of social influence that could be relevant in safety-related contexts. Second, the thesis investigates whether robotic influence can be leveraged to promote cooperative behaviors. Using an experimental paradigm based on the Public Goods Game, the findings suggest that cooperative robot behavior alone is not sufficient to increase human cooperation; future research is needed to explore more nuanced strategies, such as the integration of robots' social cues. Understanding these dynamics could have broad applications, such as improving teamwork in educational and work settings. Finally, the effects of social exclusion in human-robot groups are explored, demonstrating that robots can induce feelings of ostracism similar to those induced by human excluders. Notably, exclusion by a robot is perceived as more threatening than exclusion by a human. Additionally, excluded individuals with prior exposure to the robot are more likely to apply social strategies used in human relations to their interaction with the robot. These findings highlight the need for inclusive robotic cognitive architectures that recognize social exclusion and foster balanced group interactions. By advancing our understanding of group interactions with robots, this research contributes to the broader goal of developing social robots that understand humans and behave in ways that allow humans to interact with them effortlessly.
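For readers unfamiliar with the paradigm mentioned above, the sketch below illustrates the payoff rule of a standard linear Public Goods Game: each player keeps whatever part of an endowment they do not contribute and receives an equal share of the multiplied common pool. The endowment, multiplier, and group composition here are illustrative assumptions only and are not the parameters used in the thesis experiments.

```python
# Minimal sketch of a standard linear Public Goods Game payoff rule.
# All parameter values are hypothetical and not taken from the thesis.

def public_goods_payoffs(contributions, endowment=10.0, multiplier=1.6):
    """Return each player's payoff for one round.

    Each player keeps the uncontributed part of the endowment and receives
    an equal share of the multiplied common pool.
    """
    pool_share = multiplier * sum(contributions) / len(contributions)
    return [endowment - c + pool_share for c in contributions]

# Hypothetical round: a fully cooperative robot (contributes its whole endowment)
# alongside three humans who contribute little or nothing.
contributions = [10.0, 2.0, 3.0, 0.0]  # robot, human 1, human 2, human 3
print(public_goods_payoffs(contributions))  # [6.0, 14.0, 13.0, 16.0]
```

In this illustrative round the fully cooperative robot earns the least, which makes visible why a robot's cooperative example alone may not be enough to shift human contributions: free-riding remains individually profitable.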
File | Size | Format
---|---|---
phdunige_4229393_1.pdf (embargo until 23/05/2026) | 18.72 MB | Adobe PDF
phdunige_4229393_2.pdf (embargo until 23/05/2026) | 5.71 MB | Adobe PDF
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/211113
URN:NBN:IT:UNIGE-211113