Associative memory and recurrent neural networks: a dynamical systems approach

BETTETI, SIMONE
2026

Abstract

This thesis explores the theory of associative memory in recurrent neural networks through the lens of dynamical systems. Starting from the classical literature on voltage-based and firing-rate models, we analyze their fundamental dynamical properties, including the existence of fixed points and the conditions for local and global stability. These results provide a unified framework to understand how recurrent networks store and retrieve information. Building on this foundation, we introduce a novel framework for input-driven memory retrieval, addressing a long-standing gap in the study of associative networks. We investigate this mechanism in both deterministic and stochastic settings, showing how external inputs and intrinsic noise can guide transitions between memories, smooth retrieval dynamics, and shape the underlying energy landscape. Finally, we extend classical estimates of memory storage capacity to a broader class of activation functions, thereby widening the applicability of associative memory models. Together, these contributions advance the understanding of recurrent neural networks as dynamical systems, bridging theoretical neuroscience with modern machine learning perspectives.
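As a minimal illustration of the model family the abstract refers to (and not code from the thesis itself), the sketch below implements classical Hopfield-style associative retrieval in Python: binary patterns stored with a Hebbian rule, asynchronous sign-threshold updates, and an energy function that decreases along the dynamics. All names and parameters here are illustrative assumptions.

# Minimal sketch of classical Hopfield-style associative retrieval.
# Illustrative only; assumes Hebbian storage of binary (+/-1) patterns
# and asynchronous sign-threshold updates.
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 5                        # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix with zero diagonal (no self-coupling).
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def energy(s):
    # Hopfield energy E(s) = -1/2 s^T W s; non-increasing under the updates.
    return -0.5 * s @ W @ s

def retrieve(s, sweeps=20):
    # Asynchronous dynamics: each sweep updates neurons one at a time
    # in random order, setting each to the sign of its local field.
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Cue: a corrupted copy of the first stored pattern (20% of bits flipped).
cue = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)
out = retrieve(cue)
print("overlap with stored pattern:", (out @ patterns[0]) / N)
print("energy: cue", energy(cue), "-> retrieved", energy(out))

The overlap printed at the end should be close to 1 when retrieval succeeds, and the energy of the retrieved state should be lower than that of the cue, reflecting descent on the energy landscape discussed in the abstract.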
Defended: 19 February 2026
Language: English
Supervisor: ZAMPIERI, SANDRO
Institution: Università degli studi di Padova
Files in this item:
tesi_Simone_Betteti-finale.pdf — Adobe PDF, 9.73 MB, open access
License: All rights reserved

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/361057
The NBN code of this thesis is URN:NBN:IT:UNIPD-361057