Rigor with machine learning from field theory to the Poincaré conjecture
S Gukov, J Halverson, F Ruehle - Nature Reviews Physics, 2024 - nature.com
Despite their successes, machine learning techniques are often stochastic, error-prone and
black-box. How could they then be used in fields such as theoretical physics and pure …
Towards understanding grokking: An effective theory of representation learning
We aim to understand grokking, a phenomenon where models generalize long after
overfitting their training set. We present both a microscopic analysis anchored by an effective …
Representation learning via quantum neural tangent kernels
Variational quantum circuits are used in quantum machine learning and variational quantum
simulation tasks. Designing good variational circuits or predicting how well they perform for …
Bootstrability in line-defect CFTs with improved truncation methods
V Niarchos, C Papageorgakis, P Richmond… - Physical Review D, 2023 - APS
We study the conformal bootstrap of 1D CFTs on the straight Maldacena–Wilson line in 4D
N = 4 super-Yang–Mills theory. We introduce an improved truncation scheme with an “OPE …
Exact marginal prior distributions of finite Bayesian neural networks
J Zavatone-Veth, C Pehlevan - Advances in Neural …, 2021 - proceedings.neurips.cc
Bayesian neural networks are theoretically well-understood only in the infinite-width limit,
where Gaussian priors over network weights yield Gaussian priors over network outputs …
Neural network field theories: non-Gaussianity, actions, and locality
Both the path integral measure in field theory (FT) and ensembles of neural networks (NN)
describe distributions over functions. When the central limit theorem can be applied in the …
Asymptotics of representation learning in finite Bayesian neural networks
J Zavatone-Veth, A Canatar… - Advances in neural …, 2021 - proceedings.neurips.cc
Recent works have suggested that finite Bayesian neural networks may sometimes
outperform their infinite cousins because finite networks can flexibly adapt their internal …
Contrasting random and learned features in deep Bayesian linear regression
JA Zavatone-Veth, WL Tong, C Pehlevan - Physical Review E, 2022 - APS
Understanding how feature learning affects generalization is among the foremost goals of
modern deep learning theory. Here, we study how the ability to learn representations affects …
[BOOK][B] The Calabi–Yau Landscape: From Geometry, to Physics, to Machine Learning
YH He - 2021 - books.google.com
Can artificial intelligence learn mathematics? The question is at the heart of this original
monograph bringing together theoretical physics, modern geometry, and data science. The …
Disorder averaging and its UV discontents
JJ Heckman, AP Turner, X Yu - Physical Review D, 2022 - APS
We present a stringy realization of quantum field theory ensembles in D ≤ 4 spacetime
dimensions, thus realizing a disorder averaging over coupling constants. When each …