
Vision devices and intelligent systems for monitoring the well-being of humans in healthcare and manufacturing

Romeo, Laura
2024

Abstract

The present PhD research explores the integration of vision devices and intelligent systems to monitor and enhance human well-being in healthcare and manufacturing contexts, starting from the standards proposed in Industry 4.0 and aiming to follow the principles of the novel Industry 5.0. Depth sensors and deep learning technologies have been exploited to address the critical aspects of human mobility assessment and action segmentation in real, non-simulated scenarios. The Microsoft Azure Kinect, a state-of-the-art depth sensor, has been selected as a key instrument for data collection, and innovative camera calibration methods have been developed to ensure the accuracy and reliability of the gathered data.

Within the realm of healthcare, the research activity addresses the substantial challenges posed by neurodegenerative diseases to the well-being of older individuals. This part of the study focuses on monitoring and assessing the mobility of elderly patients, aiming to support remote diagnosis and improve their quality of life. Traditional mobility tests, administered by healthcare professionals, are essential for evaluating movement skills. Nevertheless, such techniques often suffer from human subjectivity, which could lead to errors in the assessments. To address such issues, video-based systems have been studied, aiming to remotely monitor and objectively evaluate mobility, reducing the burden on elderly patients.

In the context of manufacturing, human actions are pivotal in enhancing operational efficiency, productivity, and safety. Such challenges have led to the increasing use of industrial robotic solutions, mainly including collaborative robots, which can share a common workspace with humans, carrying out their respective tasks simultaneously. This part of the research delves into the segmentation of human tasks for intelligent manufacturing systems, exploring the integration of vision devices and deep learning technologies to improve the efficiency and accuracy of manufacturing processes. In general, the study of such systems is aimed at creating comfortable work environments, adaptable to the needs and abilities of individual people, increasing the well-being of operators in a human-centered factory concept.

The main goal of the present study is to evaluate the effectiveness of machine learning and deep learning models for mobility assessment and action segmentation, to determine their suitability for human monitoring. However, a notable gap in the literature is identified: the absence of datasets representing human actions in realistic environments. To bridge this gap, the research includes the creation and validation of datasets capturing human actions in healthcare and manufacturing scenarios, emphasizing the importance of generalization across different locations. By addressing the unique challenges in both healthcare and manufacturing, this study contributes to the development of intelligent systems that promote human well-being and enhance operational efficiency, aiming to align with the paradigms of Industry 5.0.
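For illustration only, the following is a minimal sketch of the kind of checkerboard-based camera calibration that RGB-D acquisition setups such as the Azure Kinect commonly rely on before data collection. The OpenCV routine, the 9x6 board, the square size, and the image folder are assumptions made for this example; the sketch does not reproduce the calibration methods actually developed in the thesis.

```python
# Illustrative intrinsic calibration of an RGB camera from checkerboard images.
# Assumptions: OpenCV, a 9x6 checkerboard with 25 mm squares, captures stored in
# "calib_images/". This is a generic sketch, not the thesis's calibration method.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)          # inner corners per row/column of the board (assumed)
SQUARE_SIZE_MM = 25.0     # physical size of one square (assumed)

# 3D coordinates of the board corners, expressed in the board's own frame
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE_MM

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.png"):   # hypothetical capture folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    # refine detected corners to sub-pixel accuracy
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]

# estimate camera matrix and lens distortion from all detected boards
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"re-projection RMS error: {rms:.3f} px")
print("camera matrix:\n", camera_matrix)
```

In such a pipeline, the re-projection error offers a first check on the reliability of the recorded data before it is used for downstream tasks like mobility assessment or action segmentation.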
English
Perri, Anna Gina
Ciminelli, Caterina
Politecnico di Bari
Files attached to this record:
36 ciclo-ROMEO Laura.pdf (open access, 17.16 MB, Adobe PDF)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/65228
The NBN code of this thesis is URN:NBN:IT:POLIBA-65228