Robotic Grasping and Manipulation: Perception, Planning, and Control for Industry 4.0 and Logistics

D'AVELLA, SALVATORE
2023

Abstract

Autonomous and reliable robotic grasping is a desirable capability in robotic manipulation and is relevant to a broad range of applications, such as service robotics, pick-and-place in manufacturing and logistics, and agriculture. Tasks that feel ordinary to humans, like chopping vegetables, loading the dishwasher, and folding laundry, remain incredibly challenging for robots because they are multidisciplinary tasks that span from the mechatronic design of grippers to the higher-level domains of perception, planning, and control. In addition, robotic grasping requires many steps to succeed, from detecting the objects to synthesizing the grasp to grasping and holding the target throughout the operation while accounting for the surrounding environment. This dissertation encompasses the perception, planning, and control aspects of robotic grasping and manipulation, following the new requirements of autonomy, intelligence, and flexibility introduced by the Industry 4.0 paradigm. The thesis proposes contributions for each aspect and for the complete grasping problem. The objectives common to all the contributions are grasping a wide variety of objects, including soft or fragile ones, adapting to changing environments, and keeping a close eye on industrial implications. In particular, for the perception part, the thesis discusses a new visual approach for 6D pose estimation of objects in the category-level setting and a tactile technique based on FBG optical fiber sensing for retrieving forces during the grasping phase. Concerning the planning aspect, it presents a one-shot high-level planner that exploits a Graph Neural Network to abstract task-specific rules and a versatile task-oriented grasp planner modeled as an optimization problem. For the control component, it describes a self-sensing capacitance circuit for an electroactive gripper to check and monitor the success of the grasping task and a slippage-detection algorithm exploiting FBG optical fiber sensors. In addition, the thesis describes robotic grasping algorithms for several grippers, including parallel-jaw, suction, multi-DoF soft multimodal, electroactive, and UJG grippers. The work concludes with a novel benchmark for cluttered bin-picking scenarios that introduces a new evaluation metric which does not constrain participants to fix the objects in precise positions while still guaranteeing a fair comparison, mimicking the flexibility of the new industrial paradigms and taking into account the objects' difficulty during scoring.
21 July 2023
Italian
Benchmarking
Deep Learning for Grasping
Dexterous Manipulation
Industry 4.0
Logistics
Multimodal Grasping
One-shot Imitation Learning
Pose Estimation
Robotic Grasping
Tactile Sensing
Task-oriented Grasping
Vision for Grasping
TRIPICCHIO, PAOLO
AVIZZANO, CARLO ALBERTO
BIANCHI, MATTEO
SUN, YU
Files in this record:
PhD_Thesis_SalvatoreDAvella.pdf (Adobe PDF, 164.48 MB), under embargo until 07/07/2096

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/217477
The NBN code of this thesis is URN:NBN:IT:SSSUP-217477