Enhancing Human-Robot Interaction (HRI) through AI-powered Biometrics

MATTEI, ENRICO
2025

Abstract

This thesis explores the enhancement of Human-Robot Interaction (HRI) through AI-driven biometric systems, focusing on industrial and healthcare applications. Key contributions include intelligent systems that use emotional intelligence (EI), EEG signals, and multimodal datasets to improve collaboration, safety, and efficiency in human-robot workflows. In industrial settings, an EI Vision Transformer (ViT)-based system was implemented to monitor operator attention levels from facial expressions and hand gestures; this system optimizes collaborative robot (cobot) trajectory planning, enhancing safety and reducing downtime. For healthcare applications, the thesis addresses open challenges in Brain-Computer Interface (BCI) technologies and introduces REMEMO, a dataset for emotion recognition based on self-evoked memories that supports progress in emotion-driven assistive technologies. It also presents MOVING, a novel multimodal dataset that integrates EEG signals with virtual-glove hand tracking for motor rehabilitation tasks, advancing research in assistive devices and rehabilitation robotics. In addition, deep learning architectures were evaluated for classifying motor execution and emotional states, highlighting the potential of compact, efficient models such as EEGNet for real-time applications. Future research directions include extending the datasets to more diverse subject populations, refining signal-processing methods, and exploring physiological interpretations to further bridge the gap between humans and robots. These advances underline the transformative role of AI in fostering intuitive, safe, and collaborative human-robot ecosystems.
Date: 27 March 2025
Language: English
MANES, COSTANZO
PLACIDI, GIUSEPPE
DI RUSCIO, DAVIDE
Università degli Studi dell'Aquila
Files in this record:
enrico_mattei_thesis_phd_final.pdf (open access, 27.05 MB, Adobe PDF)
enrico_mattei_thesis_phd_final_1.pdf (open access, 27.05 MB, Adobe PDF)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/202549
The NBN code of this thesis is URN:NBN:IT:UNIVAQ-202549