Machine learning and quantum computing
Simone Bordoni
2025
Abstract
This thesis presents original contributions at the intersection of quantum computing and machine learning, investigating how the two fields can benefit from each other. The first part of the thesis presents two applications of machine learning techniques in quantum computing. The first contribution uses artificial neural networks to decode quantum error correction codes, achieving faster decoding times while maintaining a high level of accuracy. The originality of this work lies in the application of explainable machine learning techniques both to improve the decoding performance and to provide insights into the decoding process. The second contribution introduces a novel technique based on reinforcement learning to characterize and simulate the noise affecting a quantum chip. This technique reduces the heuristic assumptions on the noise model, making it more adaptable to the specific noise characteristics of the quantum device.

The second part of the thesis demonstrates, with two examples, how machine learning algorithms can be implemented on quantum devices by replacing artificial neural networks with parametrized quantum circuits. The first contribution in this part introduces a quantum anomaly detection algorithm applied to high-energy physics. This algorithm allows for the identification of anomalous patterns in quantum data while maintaining a level of accuracy comparable to classical algorithms. The novelty of this work lies in the first use of quantum circuits for anomaly detection in a muon drift chamber trigger system, with possible future applications to quantum detector systems in high-energy physics experiments. The second contribution presents the quantum version of a generative diffusion model, capable of sampling quantum states from a well-defined distribution.
This first implementation of a quantum diffusion model is not meant to outperform classical models, but rather to highlight the potential advantages and limitations of this kind of algorithm. The aim of the thesis is to show how quantum computing can benefit from machine learning, and how machine learning can be implemented on quantum devices, setting the stage for future advancements in both fields.

| File | License | Size | Format |
|---|---|---|---|
| Tesi_dottorato_Bordoni.pdf (open access) | All rights reserved | 11.14 MB | Adobe PDF |
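The second part of the thesis rests on the idea of replacing an artificial neural network with a parametrized quantum circuit whose gate angles are trained like network weights. The following NumPy snippet is a purely illustrative sketch, not code from the thesis: it simulates a single qubit, treats one RY rotation angle as the trainable parameter, and optimizes it with the parameter-shift rule; the target value, learning rate, and function names are all assumptions made for the example.

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """Expectation value <Z> after applying RY(theta) to |0>."""
    state = ry(theta) @ np.array([1.0, 0.0])
    pauli_z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ pauli_z @ state)

# Train the circuit so that <Z> approaches the (arbitrarily chosen) target -1,
# i.e. the rotation should steer |0> toward |1>.
target = -1.0
theta, lr = 0.1, 0.2
for _ in range(200):
    # Parameter-shift rule: exact gradient of <Z> with respect to theta
    grad = 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))
    # Chain rule for the squared-error loss (<Z> - target)^2
    loss_grad = 2.0 * (expectation_z(theta) - target) * grad
    theta -= lr * loss_grad

print(expectation_z(theta))  # close to the target value
```

In a real variational algorithm the circuit has many qubits and layered parametrized gates, and the expectation values are estimated from measurement shots on hardware rather than computed exactly, but the training loop has this same shape.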
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14242/303840
URN:NBN:IT:UNIROMA1-303840