Publication
APS March Meeting 2022
Conference paper

Representation Learning via Quantum Neural Tangent Kernels


Abstract

Variational quantum circuits are used in quantum machine learning and variational quantum simulation tasks. It remains unclear, however, how to design good variational circuits or how to predict their performance on a given learning or optimization task. In this paper, we address these problems by studying variational quantum circuits through the theory of neural tangent kernels. We define quantum neural tangent kernels and derive the dynamical equation for the loss function in optimization and learning tasks. We analyze these kernels in the frozen limit, in which the variational angles change slowly and a linear perturbation of the angles suffices to describe the dynamics; in machine learning this is commonly known as the lazy training regime. We then extend the analysis to a dynamical setting, including quadratic corrections in the variational angles. We define a large-width limit for quantum kernels, showing that a hybrid quantum-classical neural network can be approximately Gaussian. Our results identify a regime in which the training dynamics of variational quantum circuits, used for quantum machine learning and optimization problems, can be understood analytically.

*JL is supported in part by International Business Machines (IBM) Quantum through the Chicago Quantum Exchange, and the Pritzker School of Molecular Engineering at the University of Chicago through Liang Jiang's group.
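As a point of reference, and not necessarily the paper's exact definitions, a quantum neural tangent kernel can be sketched as follows. Assume an illustrative variational circuit output $f(x; \theta) = \langle 0 | U^\dagger(x, \theta)\, O\, U(x, \theta) | 0 \rangle$ for some observable $O$, trained on a mean-squared-error loss with learning rate $\eta$; the choice of $f$, $O$, and the loss are assumptions here. Under gradient flow, the kernel and the residual dynamics take the form

$$K(x_i, x_j) = \sum_l \frac{\partial f(x_i; \theta)}{\partial \theta_l}\, \frac{\partial f(x_j; \theta)}{\partial \theta_l}, \qquad \frac{d\varepsilon_i}{dt} = -\eta \sum_j K(x_i, x_j)\, \varepsilon_j,$$

with residuals $\varepsilon_i = f(x_i; \theta) - y_i$. In the frozen (lazy training) limit described in the abstract, $K$ stays approximately constant during training, so the residuals decay exponentially at rates set by the eigenvalues of $K$; the dynamical setting with quadratic corrections in the variational angles relaxes this assumption.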