Towards adaptive neural networks: Bio-inspired principles for functional and structural plasticity

Ferigo, Andrea
2025

Abstract

The human brain is one of the most complex systems known. Billions of neurons interact through trillions of synapses, producing high levels of intelligence and reasoning. While modern Artificial Neural Networks (ANNs) achieve strong performance on many tasks, the ability of the human brain (and natural brains in general) to adapt to new situations or to learn multiple tasks remains unmatched. These capabilities arise from several processes that shape the brain over a lifetime. One such process is plasticity: the ability of neurons to adapt their synapses. Varying levels of myelination and synaptic development change the behavior of neurons and, ultimately, of the brain. In this dissertation, we attempt to imitate this process by employing Hebbian Learning, a model of plasticity, in ANNs. This model is inspired by Hebb’s theory, which can be summarized as: “If one neuron actively contributes to the firing of a second neuron, their connection strengthens”. In particular, we employed the ABCD model in two categories of control tasks: Voxel-based Soft Robots (VSRs) and control tasks from the Gymnasium library. First, we studied how the ABCD model affects VSR performance and how such robots respond to physical damage, finding that agents equipped with the Synaptic-centric Hebbian Learning (SHL) model cope better with such impairments. Next, in VSR tasks, we found that the same SHL model can specialize network weights based on each voxel’s position within the body; thus, SHL can specialize a single ANN into functionally distinct networks. Then, we combined the SHL model with a pruning mechanism to simulate the synaptogenesis process that naturally occurs in the human brain. This resulted in the Self-Building Neural Network (SBNN) model, which can adapt its own structure based on the agent’s experiences during the task. Finally, we explored a new variant of the SHL model that, drawing inspiration from Hebb’s original theory, treats plasticity as an interaction between neurons rather than solely a property of the synapses. By moving the rule from the synapses to the neurons, this variant, which we call Neuron-centric Hebbian Learning (NcHL), vastly reduces the number of parameters without losing performance.
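For reference, the ABCD model named in the abstract is a parameterized Hebbian rule; in a common formulation (the notation here is illustrative and may differ slightly from the thesis), the weight w_{ij} of the synapse from a presynaptic neuron i to a postsynaptic neuron j is updated at every time step as

\Delta w_{ij} = \eta \left( A_{ij} \, o_i o_j + B_{ij} \, o_i + C_{ij} \, o_j + D_{ij} \right)

where o_i and o_j are the pre- and postsynaptic activations, \eta is a learning rate, and A_{ij}, B_{ij}, C_{ij}, D_{ij} are coefficients optimized (e.g., evolved) per synapse. In the synaptic-centric setting the number of plasticity parameters thus grows with the number of synapses; the neuron-centric variant (NcHL) instead attaches the coefficients to the neurons and combines the pre- and postsynaptic values in the update, so the parameter count scales with the number of neurons, which is typically far smaller.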
Date: 26 March 2025
Language: English
Supervisor: Iacca, Giovanni
University: Università degli studi di Trento
Place: Trento
Pages: 111
File: phd_unitn_Ferigo_Andrea.pdf (open access, Adobe PDF, 7.95 MB)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/200916
The NBN code of this thesis is URN:NBN:IT:UNITN-200916