
On the Role of Regularization in Machine Learning: Classical Theory, Computational Aspects and Modern Regimes

PAGLIANA, NICOLÒ
2022

Abstract

In this work we study the performance of different machine learning models, focusing on their regularization properties in order to explain various phenomena observed in practice. We consider linear models on a possibly infinite-dimensional feature space, trained by minimizing an empirical mean squared error. We study the regularization properties of accelerated methods, such as Nesterov acceleration or the $\nu$-method, as well as the properties of interpolating estimators, in which the main sources of regularization vanish, and we explain different behaviours that can be observed in practical applications.
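
A minimal sketch (not taken from the thesis) of the kind of setup the abstract describes: empirical least squares fitted with plain and Nesterov-accelerated gradient descent, where the number of iterations plays the role of the regularization parameter (early stopping). The synthetic data, dimensions, and variable names below are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.5 * rng.standard_normal(n)   # noisy linear model

# Lipschitz constant of the gradient of (1/2n)||Xw - y||^2 and step size.
L = np.linalg.norm(X, 2) ** 2 / n
step = 1.0 / L

def gradient_descent(num_iter):
    """Plain gradient descent on the empirical mean squared error."""
    w = np.zeros(d)
    for _ in range(num_iter):
        grad = X.T @ (X @ w - y) / n
        w = w - step * grad
    return w

def nesterov(num_iter):
    """Nesterov-accelerated gradient descent on the same objective."""
    w, w_prev = np.zeros(d), np.zeros(d)
    for t in range(1, num_iter + 1):
        momentum = (t - 1) / (t + 2)
        v = w + momentum * (w - w_prev)          # extrapolation step
        grad = X.T @ (X @ v - y) / n
        w_prev, w = w, v - step * grad
    return w

# The iteration count acts as the regularization parameter: accelerated
# iterations reach a given error level in fewer steps than plain descent.
for T in (5, 50, 500):
    err_gd = np.linalg.norm(gradient_descent(T) - w_true)
    err_nes = np.linalg.norm(nesterov(T) - w_true)
    print(f"T={T:4d}  GD error={err_gd:.3f}  Nesterov error={err_nes:.3f}")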
31 May 2022
English
DE VITO, ERNESTO
ROSASCO, LORENZO
VIGNI, STEFANO
Università degli studi di Genova
Files in this record:
phdunige_3943821.pdf (open access, 4 MB, Adobe PDF)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/67682
The NBN code of this thesis is URN:NBN:IT:UNIGE-67682