Enhancing State Estimation in Quadruped Robots: Multi-Sensor Fusion for Challenging Terrains
NISTICO', YLENIA
2025
Abstract
Quadruped robots are increasingly important in fields such as automation, inspection, and monitoring due to their unique capability to navigate complex and unstructured environments. Unlike wheeled robots, legged robots can overcome various obstacles, making them highly versatile for real-world applications. However, this versatility comes with challenges. Robust performance in such environments depends heavily on accurate state estimation, which provides essential information on the robot’s position, orientation, and velocity. State estimation is inherently more complex in legged robots due to their dynamic gait and the constantly changing points of contact with the ground. The integration of leg kinematics data can play a crucial role in improving state estimation accuracy, as it supplies additional information that complements data from other sensors. My research aims to advance state estimation capabilities in quadruped robots, enabling them to autonomously recognize, interpret, and adapt to their surroundings, which is vital for maintaining balance, perceiving obstacles, and executing complex maneuvers in dynamic and unpredictable terrains. To achieve this, I concentrated on developing state estimation algorithms that integrate real-time sensor data, leveraging multiple sensor streams for enhanced precision and reliability. This integration is critical to achieving dynamic motion control and autonomous operation, allowing robots to make immediate adjustments to their gait and trajectory in response to environmental changes. Additionally, I explored the use of Lie groups to design state estimation frameworks that employ both filtering and smoothing techniques. The filtering approach offers real-time responsiveness by using current sensor data to update the robot’s state, while the smoothing approach optimizes over a set of past states for improved accuracy. The primary outcomes of my research are embodied in three major contributions. 
The first is MUSE, a MUlti-sensor State Estimator designed specifically for quadruped robots. MUSE integrates data from a range of sensors, including Inertial Measurement Units (IMUs), encoders, force/torque sensors, cameras, and LiDARs, to provide accurate, reliable, and real-time state estimation even in challenging real-world environments with uneven or slippery surfaces. MUSE incorporates a dedicated slip detection module that enables the robot to detect slippery terrain and correct the estimate by discarding the unreliable leg odometry measurements. Additionally, MUSE is built to be modular and flexible, allowing it to interface with various robot platforms and sensor configurations, making it adaptable to different applications and terrains. These capabilities were demonstrated in real-time operations, where MUSE provided online feedback to the locomotion controller. Furthermore, MUSE is released as an open-source tool, encouraging other researchers and engineers to use, modify, and build upon this work to further advance the field. The second major contribution is the development of frameworks for invariant state estimation, specifically a multi-sensor Invariant Extended Kalman Filter (InEKF) and an Invariant Smoother (IS). Both frameworks utilize Lie groups to incorporate leg kinematics, LiDAR positional data, and GPS coordinates (when available) to refine the robot’s state estimate and determine its global position. These frameworks are among the first to successfully integrate both proprioceptive (internal) and exteroceptive (external) measurements for invariant state estimation in legged robots, representing a significant innovation in the field. The third and final major contribution is the extensive testing of the proposed algorithms on multiple robots of varying sizes, ranging from the 21 kg Unitree Aliengo to the 90 kg HyQ robot from the Italian Institute of Technology. 
Looking ahead, future research will focus on enhancing individual components of these frameworks to improve overall estimation performance. For instance, advanced terrain estimation will play a key role in achieving fully autonomous operations. Estimating parameters such as the friction coefficient, as well as the geometrical and physical properties of the terrain (e.g., inclination and softness), will enable the robot to make real-time adjustments based on the terrain it encounters. Additionally, developing increasingly reliable mapping techniques will support long-term autonomy by allowing the robot to build a comprehensive understanding of its environment, reducing its reliance on external guidance or teleoperation. These advancements are expected to push the boundaries of autonomy in legged robots, allowing them to navigate complex environments independently, maintain stability on challenging terrain, and perform sophisticated tasks with minimal human intervention.

File | Size | Format
---|---|---
phdunige_5180549.pdf (embargo until 20/02/2026) | 16.04 MB | Adobe PDF
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/193704
URN:NBN:IT:UNIGE-193704