Human Activity Recognition based on Depth Camera for Assistive Robotics

2017

Abstract

The development of technologies closely connected to humans has increased exponentially in recent years. Smartphones, smart bracelets, and all sorts of Internet of Things (IoT) devices permeate our daily lives. Realistically, in the near future, smart-home technologies will spread further, and sensors, domestic robots, computers, and wearable devices of all kinds will share the home environment with us. The importance of being aware of human behavior is therefore beyond doubt. Human activity recognition plays an important role in ambient assisted living, providing useful tools to improve people's quality of life. This dissertation investigates and discusses the use of a human activity recognition system based on a depth camera for assistive robotics. In particular, the thesis addresses different aspects of the main topic, ranging from classification performance on activities of daily living, to application to social interactions, integration with robotics by means of Cloud technology, and a combined approach with a wearable sensor. The developed human activity recognition system is based on skeleton data and represents an activity with a small set of basic postures. Machine learning techniques are employed to extract key poses and to perform the classification. The system is evaluated on two public datasets of activities of daily living. An adaptation that addresses social activity interactions is also presented. In addition, the use of Cloud technologies to integrate the recognition module with a robotic system is investigated. The robotic system is evaluated with challenging tests carried out in a realistic environment. Finally, a multimodal approach, which extends the system with features extracted from a wearable device, is proposed. Results from the experimental tests show that a system with the described features is feasible for real applications in assistive contexts.
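The pipeline summarized in the abstract, clustering skeleton frames into a few key poses and classifying an activity from that compact posture representation, can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis implementation: plain k-means clustering, a key-pose histogram descriptor, and a 1-NN classifier are illustrative stand-ins, and the function names (`extract_key_poses`, `pose_histogram`, `classify`) are hypothetical; the thesis itself details the actual clustering method, features, and classifiers.

```python
import numpy as np

def extract_key_poses(frames, k, iters=20):
    """Cluster skeleton frames (n_frames x n_features) into k key poses
    with a plain k-means loop (illustrative; the thesis may use a
    different clustering technique)."""
    # Deterministic initialization: pick k frames spread across the clip.
    centroids = frames[np.linspace(0, len(frames) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each frame to its nearest centroid.
        d = np.linalg.norm(frames[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned frames.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = frames[labels == j].mean(axis=0)
    return centroids

def pose_histogram(frames, key_poses):
    """Describe an activity clip as a normalized histogram of its
    nearest key poses: a compact 'few basic postures' descriptor."""
    d = np.linalg.norm(frames[:, None] - key_poses[None], axis=2)
    labels = d.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(key_poses)).astype(float)
    return hist / hist.sum()

def classify(hist, train_hists, train_labels):
    """1-nearest-neighbour over key-pose histograms (an illustrative
    classifier; any standard machine-learning classifier would fit here)."""
    d = [np.linalg.norm(hist - h) for h in train_hists]
    return train_labels[int(np.argmin(d))]
```

A clip is thus reduced to a short, fixed-length descriptor regardless of its duration, which is what makes the "few basic postures" representation convenient for real-time recognition on a robot.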
22 Nov 2017
Italian
CAVALLO, FILIPPO
Scuola Superiore di Studi Universitari e Perfezionamento "S. Anna" di Pisa
Files in this record:

File                            Size        Format      Access
PhD_thesis_Manzi_final_1.pdf    15.03 MB    Adobe PDF   open access (since 23/06/2020)
liberatoria_esposito.pdf        886.71 kB   Adobe PDF   Open Access since 23/06/2020
liberatoria_moschetti.pdf       880.83 kB   Adobe PDF   Open Access since 23/06/2020

Type (all files): Other attached material

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/138887
The NBN code of this thesis is URN:NBN:IT:SSSUP-138887