Harvard Physics 272 / CS 2233
This course covers quantum learning theory, a contemporary subject at the intersection of quantum mechanics, quantum computing, statistical learning theory, and machine learning. The core question of the subject is: how can we use quantum computation to efficiently learn properties of quantum mechanical systems? Answering this question helps us understand the power of quantum computers to assist experimental physicists in studying quantum materials and quantum chemical systems, and it provides valuable tools for developing quantum machine learning algorithms that operate on quantum data. Quantum learning theory has become a core subject in quantum information and computation, and this course is one of the first of its kind to present the subject in its entirety. We will explore the theory of learning quantum states and quantum dynamics, the theory of quantum memory and quantum replica-learning, random and pseudorandom quantum circuits, and many applications to quantum many-body physics.
Time/Location: MW 3-4:15, Pierce 209
Instructors: Sitan Chen, Jordan Cotler
Office hours: Th 10:00-11:00, SEC 3.325; Tu 11-12, Goel 418
Teaching Fellows: Weiyuan Gong, Quynh Nguyen
Office hours: M 5-6 pm, Pierce G7A; W 2-3, 52 Oxford St B150
Recitation: Th 4-5 pm, Maxwell Dworkin G115
Canvas (for announcements) · Ed discussion forum · Course Overleaf
Course Policies: See the syllabus for a detailed overview.
| Lecture | Topic | Link | Readings |
|---|---|---|---|
| Lecture 1 (9/3) | Vignette: Learning an Unknown Rotation | | - HKOT23: bootstrapping for learning arbitrary unitaries down to the Heisenberg limit - GLM11: review paper on quantum metrology |
| Lecture 2 (9/8) | Classical probability and tensors | | |
| Lecture 3 (9/10) | Quantum mechanics basics I | | |
| Lecture 4 (9/15) | Quantum mechanics basics II | | |
| Lecture 5 (9/17) | Tomography I: single-copy measurements (entrywise error) | Bonus content: notes on tensor networks | |
| Lecture 6 (9/22) | Tomography II: single-copy measurements (operator norm) | | - KRT14: original paper on tomography with single-copy random measurements - GKKT18: paper that the algorithm from lecture is based on |
| Lecture 7 (9/24) | Tomography III: multi-copy measurements (representation theory basics, Schur-Weyl duality) | | - OW15, HHJWY15: original papers achieving the optimal rate for tomography - W16: PhD thesis of Wright, with exposition on these methods |
| Lecture 8 (9/29) | Tomography IV: multi-copy measurements (Schur polynomials, pretty good measurement) | | |
| Lecture 9 (10/1) | Shadows I: Classical shadows | | - CW19: overlapping quantum tomography (local observables) - HKP20: introduced the classical shadows formalism |
| Lecture 10 (10/6) | Shadows II: Finishing classical shadows; online state learning | | - A17: introduced shadow tomography - ACHKN18: online state learning |
| Lecture 11 (10/8) | Shadows III: Shadow tomography via threshold search | | - BO20: state-of-the-art guarantee for shadow tomography - BB22: matching bound via random sequential measurements |
| Lecture 12 (10/15) | Gibbs I: Introduction to Gibbs states | | |
| Lecture 13 (10/20) | Gibbs II: Learning high-temperature Gibbs states via cluster expansion | | - HKT21: optimal high-temperature learning algorithm |
| Lecture 14 (10/22) | Gibbs III: Controlling terms in the cluster expansion | | |
| Lecture 15 (10/27) | Gibbs IV: Newton-Raphson; intro to low-temperature learning | pdf-a pdf-b | - BLMT23: first efficient low-temperature learning algorithm - CAN25: efficient local low-temperature learning algorithm (this lecture) |
| Lecture 16 (10/29) | Gibbs V: Identifiability equation and regularization | | - CKG23, CKBG: quantum Gibbs sampling papers introducing the operator Fourier transform trick |
| Lecture 17 (11/3) | Stabilizer states I: Clifford circuits and symplectic inner product | | - GNW17: Section 2 is a good reference for Clifford / symplectic concepts - AG08: classical simulation of stabilizer circuits |
| Lecture 18 (11/5) | Stabilizer states II: Bell difference sampling and agnostic tomography | | - M17: stabilizer learning algorithm - GKIL23: agnostic algorithm under large stabilizer fidelity (this lecture) - CGYZ24: agnostic algorithm in the general case |