Ego Vehicle Localization in Dynamic Environments Using LiDAR-Based Classification of Static and Dynamic Tracks via Probabilistic Graphical Models

ADNAN, MUHAMMAD
2025

Abstract

Autonomous vehicles (AVs) must accurately perceive their surroundings and localize themselves in dynamic environments. Using LiDAR and odometry data, this dissertation proposes a novel framework that classifies tracks into static and dynamic categories and uses static tracks as reliable landmarks for ego-vehicle localization. The research builds on the principles of multi-target tracking (MTT), Growing Neural Gas (GNG) clustering, and Dynamic Bayesian Networks (DBN), integrating advanced algorithms such as Joint Probabilistic Data Association (JPDA) and the Markov Jump Particle Filter (MJPF). In the training phase, LiDAR and odometry data were used to classify tracks based on their relative motion patterns, achieving 87% classification accuracy. During localization, static tracks are retained as invariant landmarks, while dynamic tracks are excluded due to their variability. The framework then uses the classified static tracks as reference points for predicting the ego vehicle's trajectory during the testing phase. Localization results are first obtained from individual track predictions using the MJPF; interaction dictionaries are then combined to perform localization under scenarios such as single-track interaction, simultaneous multi-track interactions, and periods with no observations. Experimental results validate the framework's adaptability to real-world autonomous navigation scenarios and demonstrate that it can obtain accurate localization without external odometry updates. By combining reliable classification techniques with an adaptable localization strategy, this research advances the development of safe and efficient AVs.
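
A minimal Python sketch of the core idea summarized above: classifying a LiDAR track as static or dynamic by compensating ego motion with odometry and checking whether the track stays fixed in the world frame. This is only an illustration of ego-motion-compensated classification, not the thesis's GNG clustering, DBN, or MJPF machinery; the simple 2-D (SE(2)) pose model, the motion_threshold value, and all function names are assumptions introduced here.

    # Illustrative heuristic only (assumed names and threshold), not the thesis's pipeline.
    import numpy as np

    def ego_to_world(ego_pose, point_ego):
        """Transform a 2-D point from the ego frame to the world frame.
        ego_pose = (x, y, yaw) taken from odometry."""
        x, y, yaw = ego_pose
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s], [s, c]])
        return R @ point_ego + np.array([x, y])

    def classify_track(track_points_ego, ego_poses, motion_threshold=0.5):
        """Label a track 'static' if its ego-motion-compensated positions stay
        within motion_threshold metres of their mean, otherwise 'dynamic'."""
        world_pts = np.array([
            ego_to_world(pose, pt) for pose, pt in zip(ego_poses, track_points_ego)
        ])
        spread = np.linalg.norm(world_pts - world_pts.mean(axis=0), axis=1).max()
        return "static" if spread < motion_threshold else "dynamic"

    # Toy example: ego vehicle drives forward along x at 1 m/s.
    ego_poses = [(t * 1.0, 0.0, 0.0) for t in range(5)]
    parked = [np.array([10.0 - t, 2.0]) for t in range(5)]            # fixed in world
    walker = [np.array([10.0 - t, 2.0 + 0.8 * t]) for t in range(5)]  # moves laterally

    print(classify_track(parked, ego_poses))   # -> static
    print(classify_track(walker, ego_poses))   # -> dynamic

In this toy example, the parked object keeps a fixed world position once ego motion is removed and is labelled static, so it could serve as a landmark; the laterally moving pedestrian is labelled dynamic and would be excluded from localization.
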
Date: 16 Dec 2025
Language: English
Supervisors: Prof. David Martin Gomez; REGAZZONI, CARLO
Università degli studi di Genova

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/353229
The NBN code of this thesis is URN:NBN:IT:UNIGE-353229