Concepedia

Publication | Open Access

Randomized compiling for scalable quantum computing on a noisy superconducting quantum processor

Citations: 130
References: 30
Year: 2020

Abstract

The successful implementation of algorithms on quantum processors relies on the accurate control of quantum bits (qubits) to perform logic gate operations. In this era of noisy intermediate-scale quantum (NISQ) computing, systematic miscalibrations, drift, and crosstalk in the control of qubits can lead to a coherent form of error which has no classical analog. Coherent errors severely limit the performance of quantum algorithms in an unpredictable manner, and mitigating their impact is necessary for realizing reliable quantum computations. Moreover, the average error rates measured by randomized benchmarking and related protocols are not sensitive to the full impact of coherent errors, and therefore do not reliably predict the global performance of quantum algorithms, leaving us unprepared to validate the accuracy of future large-scale quantum computations. Randomized compiling is a protocol designed to overcome these performance limitations by converting coherent errors into stochastic noise, dramatically reducing unpredictable errors in quantum algorithms and enabling accurate predictions of algorithmic performance from error rates measured via cycle benchmarking. In this work, we demonstrate significant performance gains under randomized compiling for the four-qubit quantum Fourier transform algorithm and for random circuits of variable depth on a superconducting quantum processor. Additionally, we accurately predict algorithm performance using experimentally-measured error rates. Our results demonstrate that randomized compiling can be utilized to leverage and predict the capabilities of modern-day noisy quantum processors, paving the way forward for scalable quantum computing.
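The mechanism behind randomized compiling can be illustrated in miniature: each "hard" gate cycle is dressed with a random twirling gate (typically a Pauli), and a compensating correction is compiled into the adjacent "easy" cycle so that the logical circuit is mathematically unchanged while coherent errors are averaged into stochastic noise. The sketch below shows this identity for a single qubit with a Hadamard as the hard gate; the gate choice, function names, and use of the single-qubit Pauli group are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

# Single-qubit Pauli matrices: the assumed twirling group for this sketch.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I, X, Y, Z]

# An example "hard" gate for the cycle (Hadamard); any Clifford works here.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def randomized_cycle(G, rng):
    """Dress gate G with a random Pauli twirl.

    Returns (C, P) such that C @ G @ P == G exactly:
    the random Pauli P precedes the hard gate, and the correction
    C = G P† G† is compiled into the following easy cycle.
    """
    P = PAULIS[rng.integers(len(PAULIS))]
    C = G @ P.conj().T @ G.conj().T
    return C, P

rng = np.random.default_rng(0)
for _ in range(10):
    C, P = randomized_cycle(H, rng)
    net = C @ H @ P
    # The logical operation is unchanged for every random draw.
    assert np.allclose(net, H)
```

On hardware, the randomization matters because each draw routes the same logical operation through a different physical gate sequence, so a fixed coherent over-rotation no longer adds up constructively across repetitions; averaged over many random compilations it behaves like stochastic noise, which is what cycle benchmarking measures.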
