
Wearable sensors for gesture recognition in assisted living applications

2017

Abstract

Several technological solutions have been proposed to address the coming demographic challenge and to guarantee good-quality, sustainable health care services. Among these, wearable sensors and robots are attracting considerable attention. Thanks to their low cost and miniaturisation, wearable sensors have been widely investigated in Ambient Assisted Living scenarios to measure physiological and movement parameters, and several are already in daily use. Robots, in turn, have been proposed in recent years to assist elderly people at home: they can perform different types of tasks and interact both physically and socially with humans, helping with physical activities as well as entertaining and monitoring their users. To increase their abilities, it is important to make robots aware of the environment that surrounds them, especially when they work with people: they need perception strong enough to interpret what users are doing, so that they can monitor elderly persons and interact with them properly. Wearable sensors can extend this perception, enhancing a robot's monitoring and interaction capabilities.

In this context, this dissertation describes the use of wearable sensors to recognise human movements and gestures that a robot can exploit to monitor people and interact with them. For monitoring tasks, hand-worn sensors were used to recognise daily gestures. First, the performance of the wearable sensors alone was investigated through extensive experimentation in a realistic environment, to assess whether these sensors provide useful information. A mix of hand-oriented gestures and eating modes was chosen, all involving movement of the hand towards the head or mouth. Despite the similarity of the gestures, adding a sensor worn on the index finger to the commonly used wrist sensor made it possible to recognise the different gestures, with both supervised and unsupervised machine learning algorithms, while keeping obtrusiveness low.

The same sensors were then evaluated in a system combining the wearable sensors with a depth camera mounted on a mobile robot, showing how sensors worn by the user can improve the robot's perception in more realistic conditions. An experiment was performed with the moving robot placed in front of and to the side of the user, which added noise to the data and occluded the user's dominant arm. Here, the wearable sensors on the wrist and index finger provided the information needed to increase the accuracy in distinguishing among ten different activities, also overcoming the occlusion problems that can affect vision sensors.

Finally, the feasibility and perceived usability of a human-robot interaction task based on wearable sensors were investigated. Wearable sensors on the feet were used to estimate human gait parameters in real time, which were then used to control and modulate the robot's motion in two different tasks. Tests carried out with users yielded a positive evaluation of the system's performance. Here, the wearable sensors allowed the robot to move according to the user's movements, without any physical link between the robot and the person.
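As a rough illustration of the kind of supervised gesture-recognition pipeline summarised above, the sketch below classifies windows of inertial data from two worn sensors (wrist and index finger). The window length, feature set, and random-forest classifier are assumptions made for illustration only, not the dissertation's actual implementation.

    # Minimal sketch of supervised gesture classification from two inertial
    # sensors (wrist + index finger). Window length, features, and classifier
    # are illustrative assumptions, not the dissertation's actual pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    WINDOW = 128  # samples per window (assumed)

    def features(window: np.ndarray) -> np.ndarray:
        # Basic per-axis statistics; the axes hold accelerometer and
        # gyroscope channels from both sensors.
        return np.concatenate([window.mean(axis=0), window.std(axis=0),
                               window.min(axis=0), window.max(axis=0)])

    def windows(signal: np.ndarray, step: int = WINDOW // 2):
        # Half-overlapping sliding windows over a (n_samples, n_axes) signal.
        for start in range(0, len(signal) - WINDOW + 1, step):
            yield signal[start:start + WINDOW]

    # Synthetic stand-in for labelled recordings: 12 channels =
    # 2 sensors x (3-axis accelerometer + 3-axis gyroscope).
    rng = np.random.default_rng(0)
    recordings = [rng.normal(size=(1000, 12)) + label for label in range(4)]
    X = np.array([features(w) for rec in recordings for w in windows(rec)])
    y = np.array([label for label, rec in enumerate(recordings)
                  for _ in windows(rec)])

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

An unsupervised variant could cluster the same feature vectors (for example with k-means) instead of training a classifier, mirroring the abstract's mention of both supervised and unsupervised approaches.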
The implementation and evaluation of these monitoring and interaction tasks show how wearable sensors can increase the amount of information a robot can perceive about its users. This dissertation is therefore a first step towards a system combining wearable sensors and a robot to help people in daily life.
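Similarly, a minimal sketch of the gait-driven interaction idea: foot-sensor samples trigger step detection, the recent step history yields a cadence estimate, and the cadence modulates the robot's forward speed. The detection threshold, debounce time, speed mapping, and the send_velocity stub are all hypothetical, chosen only to make the control idea concrete.

    # Minimal sketch of gait-driven robot motion: forward speed is modulated
    # by the user's cadence estimated from foot-worn inertial sensors.
    # Threshold, debounce, and speed mapping are illustrative assumptions;
    # send_velocity() is a hypothetical robot interface.
    import time
    from collections import deque

    STEP_THRESHOLD = 12.0  # m/s^2, assumed heel-strike detection threshold
    MAX_SPEED = 0.8        # m/s, assumed robot speed limit

    step_times = deque(maxlen=5)  # timestamps of the most recent steps

    def on_foot_sample(accel_magnitude: float, timestamp: float) -> None:
        # Called for each foot-sensor sample; detects steps by thresholding
        # the acceleration magnitude, with a 0.3 s debounce between steps.
        if accel_magnitude > STEP_THRESHOLD:
            if not step_times or timestamp - step_times[-1] > 0.3:
                step_times.append(timestamp)

    def commanded_speed(now: float) -> float:
        # Map the current cadence (steps/s) to a forward speed; stop the
        # robot if no step has been seen for two seconds.
        if len(step_times) < 2 or now - step_times[-1] > 2.0:
            return 0.0
        cadence = (len(step_times) - 1) / (step_times[-1] - step_times[0])
        return min(MAX_SPEED, 0.4 * cadence)  # assumed linear gain

    def send_velocity(v: float) -> None:
        print(f"robot forward speed: {v:.2f} m/s")  # stand-in for a driver

    if __name__ == "__main__":
        t0 = time.time()
        # Feed a few synthetic heel strikes (about 2 steps/s), then command.
        for k in range(5):
            on_foot_sample(accel_magnitude=15.0, timestamp=t0 + 0.5 * k)
        send_velocity(commanded_speed(t0 + 2.0))

Stopping the robot when no recent step is detected reflects the abstract's point that the robot follows the user's movements without any physical link between the two.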
Date: 22 November 2017
Language: Italian
Supervisor: CAVALLO, FILIPPO
Institution: Scuola Superiore di Studi Universitari e Perfezionamento "S. Anna" di Pisa
Files in this record (all of type "other attached material"):

- PhDThesisMoschetti_finale_con_modifiche.pdf (Adobe PDF, 17.58 MB), open access from 23/06/2020
- UseOfImages_CelesteDiDonato.pdf (Adobe PDF, 731.59 kB), open access from 23/06/2020
- UseOfImages_LauraBaigesSotos.pdf (Adobe PDF, 737.91 kB), open access from 23/06/2020
- UseOfImages_TomasMecredy.pdf (Adobe PDF, 745.68 kB), open access from 23/06/2020

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/138891
The NBN code of this thesis is URN:NBN:IT:SSSUP-138891