A research team led by Zhenyu Cai at the University of Oxford, UK, has recently reported a new result: their latest study proposes high-dimensional subspace expansion using classical shadows. The paper was published in Physical Review A on 13 February 2025.
The study introduces a postprocessing technique for classical shadow measurement data that improves the precision of ground-state estimation through high-dimensional subspace expansion; the dimensionality is limited only by the amount of classical postprocessing resources, not by quantum resources. The key steps of the method are the efficient identification of useful observables from the shadow data, followed by a regularized subspace expansion that remains numerically stable even when the data are noisy.
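To make the second step concrete, here is a minimal numerical sketch of a regularized subspace expansion. This is not the authors' implementation; the function name, threshold value, and toy matrices are illustrative. Subspace expansion reduces to the generalized eigenvalue problem H c = E S c, where H and S are the Hamiltonian and overlap matrices in the expansion basis; with noisy (shadow-estimated) matrix elements, S can be near-singular, so one common regularization is to project out its ill-conditioned directions before solving:

```python
import numpy as np

def regularized_subspace_expansion(H, S, threshold=1e-2):
    """Lowest eigenvalue of H c = E S c, discarding directions where the
    overlap matrix S has eigenvalues below `threshold` (a simple
    regularization against noise-induced near-singularity)."""
    s_vals, s_vecs = np.linalg.eigh(S)
    keep = s_vals > threshold                       # drop ill-conditioned directions
    P = s_vecs[:, keep] / np.sqrt(s_vals[keep])     # whitening projector
    H_reg = P.conj().T @ H @ P                      # Hamiltonian in the kept subspace
    return np.linalg.eigvalsh(H_reg).min()

# Toy example: with S = I the problem is an ordinary eigenvalue problem.
H = np.array([[1.0, 0.2],
              [0.2, 2.0]])
S = np.eye(2)
E0 = regularized_subspace_expansion(H, S)
```

The threshold trades off bias (discarding basis directions) against variance (amplifying shadow noise through a nearly singular S); the paper's regularization is designed around exactly this trade-off.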
The team analytically investigated how noise propagates through their method and derived an upper bound on the statistical fluctuations arising from the limited number of snapshots in classical shadows. In numerical simulations, the method reduced energy estimation errors in many cases, sometimes by more than an order of magnitude. The team also demonstrated that these performance improvements are robust against both coherent errors (a bad initial state) and gate noise in the state-preparation circuits.
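The snapshot-number dependence of those statistical fluctuations can be illustrated with a minimal single-qubit sketch (again illustrative, not the authors' code): random-Pauli classical shadows of the state |0⟩ used to estimate ⟨Z⟩, whose exact value is 1. Only Z-basis snapshots carry information about ⟨Z⟩, and the standard single-qubit inverse-channel factor of 3 makes the estimator unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)

def shadow_estimate_Z(num_snapshots):
    """Unbiased classical-shadow estimate of <Z> for the state |0>."""
    bases = rng.integers(3, size=num_snapshots)   # 0=X, 1=Y, 2=Z, chosen uniformly
    # Outcomes for |0>: Z-basis shots always give +1; X/Y shots give +/-1 at random.
    outcomes = np.where(bases == 2, 1, rng.choice([-1, 1], size=num_snapshots))
    # Single-qubit shadow estimator: 3 * outcome for matching basis, 0 otherwise.
    estimates = np.where(bases == 2, 3.0 * outcomes, 0.0)
    return estimates.mean()

# The estimation error shrinks roughly as 1/sqrt(num_snapshots).
estimate = shadow_estimate_Z(100_000)
```

The per-snapshot variance of this estimator is finite (here it equals 2), so averaging N snapshots gives fluctuations of order 1/√N; the paper's bound plays the analogous role for the full matrix elements entering the subspace expansion.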
Moreover, the performance is guaranteed to be at least as good as, and in many cases better than, direct energy estimation, without using any additional quantum resources, making the approach a very natural alternative for estimating ground-state energies directly from classical shadow data.
Appendix: Original English abstract
Title: High-dimensional subspace expansion using classical shadows
Author: Gregory Boyd, Bálint Koczor, Zhenyu Cai
Issue&Volume: 2025/02/13
Abstract: We introduce a postprocessing technique for classical shadow measurement data that enhances the precision of ground state estimation through high-dimensional subspace expansion; the dimensionality is only limited by the amount of classical postprocessing resources rather than by quantum resources. Crucial steps of our approach are the efficient identification of useful observables from shadow data, followed by our regularized subspace expansion that is designed to be numerically stable even when using noisy data. We analytically investigate noise propagation within our method, and upper bound the statistical fluctuations due to the limited number of snapshots in classical shadows. In numerical simulations, our method can achieve a reduction in the energy estimation errors in many cases, sometimes by more than an order of magnitude. We also demonstrate that our performance improvements are robust against both coherent errors (bad initial state) and gate noise in the state-preparation circuits. Furthermore, performance is guaranteed to be at least as good—and in many cases better—than direct energy estimation without using additional quantum resources, and the approach is thus a very natural alternative for estimating ground state energies directly from classical shadow data.
DOI: 10.1103/PhysRevA.111.022423
Source: https://journals.aps.org/pra/abstract/10.1103/PhysRevA.111.022423
Physical Review A: founded in 1970 and published by the American Physical Society; latest impact factor: 2.97
Official website: https://journals.aps.org/pra/
Submission link: https://authors.aps.org/Submissions/login/new