Role-Adaptive Coordination and Control for Physical Human-Robot Cooperative Tasks
SIRINTUNA, DOGANAY
2026
Abstract
Once confined within safety enclosures to prevent any human contact, industrial robots have gradually evolved into collaborative partners capable of working alongside people in shared spaces, where even physical interaction has become possible. Driven by developments in collaborative robot (cobot) technology, these emerging human–robot partnerships demand new ways of coordinating and controlling physical interaction to achieve seamless and intuitive cooperation. This thesis investigates the mechanisms of collaboration and control that make such cooperation possible, focusing on how interaction strategies should adapt to the roles that emerge from the task context and environmental conditions. Achieving such adaptability, however, depends on the interplay of several elements that together constitute the multi-layered nature of physical Human–Robot Interaction (pHRI). The first part of the thesis addresses one of these key elements, the dynamics of the co-manipulated object, proposing an adaptive human–robot co-transportation framework that can handle objects with diverse physical characteristics. By combining haptic feedback transmitted through the object with human motion cues, the framework generates reactive whole-body movements for mobile collaborative robots, ensuring reliable performance across objects with varying deformability, mass distributions, and dimensions. Building upon this foundation, an obstacle-aware feedback system is introduced to extend the co-transportation framework, addressing the additional demands of the surrounding environment as another integral element shaping the interaction. While the previous approach focused on human-led cooperation in obstacle-free environments, the new framework enhances situational awareness by establishing bi-directional communication between the human and the robot.
Through the integration of vibrotactile warnings and virtual fixture mechanisms, the system not only informs the human partner about environmental constraints but also modulates the robot's motion when required, thereby improving safety and coordination in settings where the object partially obstructs the user's view of the surroundings. The thesis then extends adaptive coordination to scenarios in which visual perception is severely reduced or lost and effective cooperation requires the robot to take the lead, introducing a novel robot-assisted guidance framework for visually impaired users. In this setting, the robot proactively navigates the human through complex and unstructured environments while avoiding both dynamic and static obstacles at different levels. By combining human state estimation with environmental perception, the system continuously evaluates the level of interaction risk and dynamically modulates the arm stiffness along each axis, as well as the velocity of the mobile base, to prevent collisions with any part of the human body and ensure comfortable physical guidance. Finally, this thesis presents an information-theoretic perspective to analyze and quantify the coordination mechanisms underlying human–robot interaction. A non-parametric approach based on information entropy is proposed to capture the directionality and magnitude of information flow between agents, allowing the identification of emerging roles and interaction patterns during collaboration. Although the approach is validated in a collaborative catching task, selected for its inherently high dynamism and demand for efficient collaboration, the dimensionless and data-agnostic nature of the method supports its applicability across diverse scenarios.
Overall, this thesis offers comprehensive methodologies for achieving and analyzing role-adaptive coordination in pHRI, whose effectiveness in extensive user studies involving realistic cooperative tasks suggests applicability across manufacturing, logistics, and assistive domains.

| File | Size | Format |
|---|---|---|
| phdunige_5534762.pdf (under embargo until 26/02/2027; license: all rights reserved) | 24.98 MB | Adobe PDF |
Documents in UNITESI are protected by copyright, and all rights are reserved unless otherwise indicated.
https://hdl.handle.net/20.500.14242/359726
URN:NBN:IT:UNIGE-359726