Data underlying the publication: Transformer Models for Quantum Gate Set Tomography

DOI: 10.4121/546d1887-4ff2-4653-8828-7eb704ee3840.v1
The DOI above refers to this specific version of the dataset, which is currently the latest. Newer versions may be published in the future. For a link that will always point to the latest version, use:
DOI: 10.4121/546d1887-4ff2-4653-8828-7eb704ee3840

Datacite citation style

Yu, King Yiu; Sarkar, Aritra (2025): Data underlying the publication: Transformer Models for Quantum Gate Set Tomography. Version 1. 4TU.ResearchData. dataset. https://doi.org/10.4121/546d1887-4ff2-4653-8828-7eb704ee3840.v1
Other citation styles (APA, Harvard, MLA, Vancouver, Chicago, IEEE) are available via DataCite.

Dataset

This repository hosts the accompanying software for the following research article.


Research article: Transformer Models for Quantum Gate Set Tomography

Abstract:

Quantum computation represents a promising frontier in high-performance computing, blending quantum information theory with practical applications to overcome the limitations of classical computation. This study addresses the challenges of manufacturing high-fidelity, scalable quantum processors. Quantum gate set tomography (QGST) is a critical method for characterizing quantum processors and understanding their operational capabilities and limitations. This paper introduces ML4QGST, a novel approach to QGST that integrates machine learning techniques, specifically a transformer neural network model. Adapting the transformer model to QGST addresses the computational complexity of modeling quantum systems. Advanced training strategies, including data grouping and curriculum learning, are employed to enhance model performance and yield close agreement with ground-truth values. We benchmark this training pipeline on the constructed learning model to perform QGST for 3 gates on a 1-qubit system, estimating over-rotation error and depolarizing noise with accuracy comparable to pyGSTi. This research marks a pioneering step in applying deep neural networks to the complex problem of quantum gate set tomography, showcasing the potential of machine learning to tackle nonlinear tomography challenges in quantum computing.
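
Illustrative sketch (not from the paper or the accompanying notebooks): the abstract describes the approach only at a high level, so the hypothetical Python/PyTorch snippet below shows one way a transformer encoder could regress the two noise parameters mentioned above (an over-rotation angle and a depolarizing rate) from gate-sequence tokens and measured outcome frequencies. The class name GSTTransformer, the input encoding, and all hyperparameters are assumptions made here for illustration.

# Hypothetical minimal sketch, assuming sequences of gate labels paired with
# normalized measurement frequencies as input and two noise parameters as output.
import torch
import torch.nn as nn

class GSTTransformer(nn.Module):
    def __init__(self, vocab_size=8, d_model=64, nhead=4, num_layers=2, n_params=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)       # gate-sequence tokens
        self.count_proj = nn.Linear(1, d_model)               # measured outcome frequencies
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, n_params)               # -> (over-rotation angle, depolarizing rate)

    def forward(self, tokens, counts):
        # tokens: (batch, seq_len) integer gate labels
        # counts: (batch, seq_len, 1) normalized measurement frequencies
        x = self.embed(tokens) + self.count_proj(counts)
        x = self.encoder(x)
        return self.head(x.mean(dim=1))                        # pooled sequence representation

# Toy usage on random data, standing in for real GST circuit outcomes.
model = GSTTransformer()
tokens = torch.randint(0, 8, (16, 32))
counts = torch.rand(16, 32, 1)
target = torch.rand(16, 2)                                      # ground-truth noise parameters
loss = nn.functional.mse_loss(model(tokens, counts), target)
loss.backward()

In such a sketch, the data grouping and curriculum learning mentioned in the abstract would correspond to scheduling training batches, for example from short to long circuit sequences; that scheduling is omitted here.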


Citation:

@article{yu2024transformer,
  title   = {Transformer Models for Quantum Gate Set Tomography},
  author  = {Yu, King Yiu and Sarkar, Aritra and Ishihara, Ryoichi and Feld, Sebastian},
  journal = {arXiv preprint arXiv:2405.02097},
  year    = {2024}
}


History

  • 2025-09-11 first online, published, posted

Publisher

4TU.ResearchData

Format

txt, ipynb

Associated peer-reviewed publication

Transformer Models for Quantum Gate Set Tomography

Organizations

TU Delft, Faculty of Electrical Engineering, Mathematics and Computer Science, Department of Quantum and Computer Engineering
