Enhancing Robot Accessibility and Functionality: from Intuitive Programming to Multi-Robot Collaboration
SARNO, VALERIA
2025
Abstract
Humans exhibit an extraordinary ability to adapt to changes and interact with both their environment and others despite living in an unpredictable and uncertain world. The ambitious goal of making robots as similar to humans as possible - so that they can assist us in our daily activities - has driven robotics research toward increasingly safe and efficient solutions. While hardware advancements have led to more sophisticated and versatile robots, software innovations have expanded their capabilities, enabling them to perform increasingly complex tasks. Despite significant progress, which has allowed robots to expand beyond the industrial sector into a wide range of applications, such as space exploration, healthcare, assistance, and education, the development of fully functional robots capable of integrating into our daily lives remains a distant goal due to the complexity and unpredictability of our world.

To achieve full integration, robots must interact safely with both each other and humans, making decisions based on their perception of the environment - often dynamic and unpredictable - as well as on the actions of other agents, whether humans or robots. At the same time, humans should be able to interact with robots in an intuitive and straightforward way, customizing their behavior as needed. These challenges revolve around two key concepts: accessibility and functionality. Accessibility refers to simplifying robot programming to enable non-expert users - those without knowledge of robotics and programming - to interact with them. Functionality requires making robots capable of performing increasingly complex tasks while interacting with their surroundings and other robots.

This thesis proposes innovative solutions to address these two challenges. The ultimate goal is to enable both expert and non-expert users to interact with robots in a more natural way while making them capable of performing complex tasks in various real-world scenarios, collaborating to achieve a common goal when necessary.

First, two approaches to intuitive programming for mobile robots are presented. The first one allows users to teach paths to the robot via a joystick, enabling it to reproduce them even when initial conditions (i.e., its starting position and orientation) change or when obstacles appear along the route. The second approach, which builds on the findings and limitations of the first, allows the robot to progressively learn information about its environment as the user guides it along new paths, becoming capable of navigating within it reliably. Both approaches have been developed with a strong focus on the end-user, who may have no prior knowledge of robotics, programming, or navigation, to ensure they can still interact with such robots.

Subsequently, Behavior Trees - a widely used technique in robotics and video games - are introduced as a tool for composing individual actions or skills into more complex behaviors. A formal analysis is conducted on their convergence when employed for multi-robot systems control, aiming to identify design criteria that ensure effective cooperation. Finally, intuitive programming approaches are integrated with Behavior Trees to allow heterogeneous robots to interact and collaborate autonomously. In this way, this thesis takes a step further toward the real-world integration of robots, making them more accessible and capable of interacting effectively with humans and other robots.
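As context for how Behavior Trees compose individual actions or skills into more complex behaviors, the sketch below shows the standard tick semantics of Sequence and Fallback nodes. It is a minimal illustration only, not the thesis's implementation; all node and action names are hypothetical.

```python
# Minimal Behavior Tree sketch: Sequence and Fallback nodes with standard
# tick semantics (SUCCESS / FAILURE / RUNNING). Illustrative only; the
# composed task and the leaf actions are hypothetical examples.
from enum import Enum


class Status(Enum):
    SUCCESS = 0
    FAILURE = 1
    RUNNING = 2


class Action:
    """Leaf node wrapping a callable that returns a Status."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self):
        return self.fn()


class Sequence:
    """Ticks children left to right; returns FAILURE or RUNNING at the first
    child that does not succeed, and SUCCESS only if all children succeed."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS


class Fallback:
    """Ticks children left to right; returns SUCCESS or RUNNING at the first
    child that does not fail, and FAILURE only if all children fail."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE


# Hypothetical usage: a "deliver an item" behavior composed from simpler skills.
tree = Sequence([
    Fallback([
        Action("AtGoal", lambda: Status.FAILURE),         # condition: not at the goal yet
        Action("NavigateToGoal", lambda: Status.RUNNING), # skill: keep moving toward it
    ]),
    Action("DropItem", lambda: Status.SUCCESS),
])

print(tree.tick())  # Status.RUNNING while navigation is still in progress
```

Composing behaviors this way keeps each skill modular and reusable; the same leaf actions can be rearranged under different Sequence and Fallback nodes to build new tasks without rewriting the skills themselves.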
| File | Size | Format |
|---|---|---|
| PhD_Thesis_Valeria_SARNO.pdf (embargo until 02/07/2028) | 26.03 MB | Adobe PDF |
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/219611
URN:NBN:IT:UNIPI-219611