Block rational Krylov methods for matrix equations and matrix functions
CASULLI, Angelo Alberto
2024
Abstract
In this thesis, we describe block rational Krylov subspaces, highlighting their correspondence with rational matrices and generalizing important properties that hold in the non-block case. We develop procedures based on block rational Krylov subspaces for solving Sylvester and tensor Sylvester equations, computing functions of Hermitian hierarchically semiseparable (HSS) matrices, and applying functions of Hermitian matrices to block vectors.

In the context of Sylvester equations with low-rank right-hand sides, we devise a new formulation of the residual obtained when projection methods based on block rational Krylov subspaces are employed. This formulation enables the development of several adaptive pole selection strategies, offering a theoretical analysis of established heuristics as well as effective novel techniques.

A natural extension is represented by tensor Sylvester equations, involving $d$ summands and $d$-dimensional tensors for both the unknown and the right-hand side. Methods based on projection onto Krylov subspaces typically assume a right-hand side of rank one. In this thesis, we demonstrate how to apply block rational Krylov methods to handle right-hand sides with low multilinear or tensor train rank. By extending the results established for Sylvester equations, we present a new formulation of the residual and several adaptive pole selection strategies for the tensor case as well.

In the context of matrix functions, we use block rational Krylov methods in an unconventional manner: our aim is to leverage the rapid convergence of block rational Krylov subspaces while circumventing costly operations such as the solution of large linear systems. We present a fast algorithm for approximating $f(A)$, where $A$ is a Hermitian HSS matrix. We use a telescopic decomposition of $A$, inspired by the recent work of Levitt and Martinsson, which allows us to approximate $f(A)$ by recursively performing low-rank updates with block rational Krylov subspaces while keeping the matrices involved in those subspaces small. In particular, no large-scale linear system needs to be solved, yielding favorable complexity estimates and reduced execution times compared with existing methods.

Finally, we develop a memory-efficient algorithm for computing $f(A)C$, where $A$ is a Hermitian matrix (not necessarily HSS) and $C$ is a block vector. Our approach combines the block Lanczos algorithm with a basis compression technique based on block rational Krylov subspaces involving only small matrices; this allows us to avoid storing the entire Lanczos basis, resulting in significant reductions in memory usage. Theoretical results demonstrate that, for a wide variety of functions, the proposed algorithm differs from block Lanczos by an error term that is typically negligible.
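The projection approach for Sylvester equations outlined above can be illustrated with a minimal sketch. The code below is not the thesis's method: it uses a polynomial block Krylov basis (the simplest special case of the rational subspaces studied here, with all poles at infinity), explicit dense matrices, and a Kronecker-product solve for the small projected equation; all names and dimensions are illustrative.

```python
import numpy as np

def block_krylov_basis(A, C, k):
    """Orthonormal basis of span{C, A C, ..., A^{k-1} C} via QR."""
    blocks = [C]
    for _ in range(k - 1):
        blocks.append(A @ blocks[-1])
    Q, _ = np.linalg.qr(np.hstack(blocks))
    return Q

def sylvester_galerkin(A, B, C, D, k):
    """Galerkin projection solve of A X + X B = C D^T with a low-rank
    right-hand side C D^T; the small projected equation is solved via a
    Kronecker-product linear system."""
    U = block_krylov_basis(A, C, k)       # left search space
    V = block_krylov_basis(B.T, D, k)     # right search space
    Ak = U.T @ A @ U                      # projected coefficients
    Bk = V.T @ B @ V
    Ck = (U.T @ C) @ (D.T @ V)            # projected right-hand side
    # Solve Ak Y + Y Bk = Ck using vec(Ak Y + Y Bk) = (I ⊗ Ak + Bk^T ⊗ I) vec(Y).
    K = np.kron(np.eye(Bk.shape[0]), Ak) + np.kron(Bk.T, np.eye(Ak.shape[0]))
    y = np.linalg.solve(K, Ck.flatten(order="F"))
    Y = y.reshape(Ck.shape, order="F")
    return U @ Y @ V.T                    # low-rank approximate solution
```

In practice one would monitor the residual of the large equation (the thesis derives a formulation of this residual that avoids forming it explicitly) and enlarge the subspaces, with adaptively chosen poles, until it is small enough.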
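The Galerkin approximation of $f(A)C$ that underlies block Lanczos can likewise be sketched in a few lines. This is only the textbook projection, with the full basis stored explicitly and orthogonalized by QR; it omits the short-recurrence structure and the memory-saving basis compression that are the subject of the thesis, and all names are illustrative.

```python
import numpy as np

def funm_hermitian(M, f):
    """f of a small Hermitian matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(f(w)) @ V.T

def block_lanczos_fAC(A, C, f, k):
    """Approximate f(A) C by Galerkin projection onto the block Krylov
    subspace span{C, A C, ..., A^{k-1} C}."""
    blocks = [C]
    for _ in range(k - 1):
        blocks.append(A @ blocks[-1])
    Q, _ = np.linalg.qr(np.hstack(blocks))  # orthonormal basis (stored in full)
    T = Q.T @ A @ Q                          # projected Hermitian matrix
    return Q @ funm_hermitian(T, f) @ (Q.T @ C)
```

Storing `Q` in full is exactly the memory bottleneck of block Lanczos for large $A$; the algorithm developed in the thesis avoids it by compressing the basis with block rational Krylov subspaces built from small matrices only.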
File: Tesi.pdf (open access), 734.18 kB, Adobe PDF
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/161361
URN:NBN:IT:SNS161361