Randomized techniques for low-rank approximation of matrix and tensors with applications
BUCCI, ALBERTO
2025
Abstract
This dissertation develops novel randomized techniques for low-rank approximation of matrices and tensors, focusing on streaming and structured methods. We prove the numerical stability of the Nyström approximation when small truncations are applied, confirming its reliability. For tensors, we introduce the multilinear Nyström algorithm and a sequential variant, which avoid costly orthogonalizations through single-pass sketching. These methods achieve near-optimal accuracy and stability, with potential further improvements via structured sketching. On the structured-sketching side, we extend the matrix Chernoff bound to sketches with Kronecker product structure. We also develop the tree tensor network (TTN) Nyström algorithm, the first streaming method for the TTN format, together with a sequential variant for efficiency. Future work could refine the rounding techniques and extend the approach to non-acyclic tensor networks. Lastly, we propose TT-sGMRES, a sketched variant of TT-GMRES that uses randomization to reduce orthogonalization costs and to manage tensor ranks efficiently; it improves storage efficiency and rivals state-of-the-art solvers such as AMEn. This thesis advances randomized low-rank approximation with both theoretical insights and practical algorithms for large-scale computations.
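For context, the recurring building block in the abstract is the randomized (generalized) Nyström approximation. The snippet below is a minimal illustrative sketch in NumPy, assuming Gaussian sketch matrices and a truncated pseudoinverse of the small core matrix; it is not the author's exact algorithm, whose stability analysis and tensor extensions are developed in the thesis.

```python
import numpy as np

def generalized_nystrom(A, r, p=10, rng=None):
    """Minimal sketch of a generalized Nystrom low-rank approximation.

    Builds A ~= (A X) (Y^T A X)^+ (Y^T A) with Gaussian sketches X (r columns)
    and Y (r + p columns), applying a small truncation when inverting the core
    matrix. Illustrative only; the thesis analyses the stabilized variant and
    its tensor extensions in detail.
    """
    rng = np.random.default_rng(rng)
    n, m = A.shape
    X = rng.standard_normal((m, r))       # right sketch
    Y = rng.standard_normal((n, r + p))   # oversampled left sketch
    AX = A @ X                            # the only products involving A:
    YA = Y.T @ A                          # both can be formed in a single pass
    core = Y.T @ AX                       # small (r + p) x r core matrix
    # Truncated pseudoinverse of the core for numerical stability
    U, s, Vt = np.linalg.svd(core, full_matrices=False)
    tol = 10 * np.finfo(float).eps * (s[0] if s.size else 0.0)  # illustrative tolerance
    s_inv = np.where(s > tol, 1.0 / s, 0.0)
    core_pinv = (Vt.T * s_inv) @ U.T
    # Return the two low-rank factors; their product approximates A with rank <= r
    return AX @ core_pinv, YA
```

In the tensor settings discussed in the abstract, the dense Gaussian sketches above would be replaced by structured (e.g. Kronecker-structured) sketches acting on the tensor modes, which is where the extended matrix Chernoff bound comes into play.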
File | Access | Size | Format | |
---|---|---|---|---|
PHD_Activity_Report_and_Publications.pdf | not available | 34.21 kB | Adobe PDF | |
PHD_THESIS.pdf | open access | 2.38 MB | Adobe PDF | View/Open |
Summary_of_PHD_THESIS.pdf | open access | 23.16 kB | Adobe PDF | View/Open |
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/216542
URN:NBN:IT:UNIPI-216542