Small random initialization is akin to spectral learning: Optimization and generalization guarantees for overparameterized low-rank matrix reconstruction
D Stöger, M Soltanolkotabi - Advances in Neural …, 2021 - proceedings.neurips.cc
Recently there has been significant theoretical progress on understanding the convergence
and generalization of gradient-based methods on nonconvex losses with overparameterized …
Low rank tensor recovery via iterative hard thresholding
H Rauhut, R Schneider, Ž Stojanac - Linear Algebra and its Applications, 2017 - Elsevier
We study extensions of compressive sensing and low rank matrix recovery (matrix
completion) to the recovery of low rank tensors of higher order from a small number of linear …
Guarantees of Riemannian optimization for low rank matrix recovery
We establish theoretical recovery guarantees of a family of Riemannian optimization
algorithms for low rank matrix recovery, which concerns recovering an m×n rank-r matrix from …
Fast state tomography with optimal error bounds
Projected least squares is an intuitive and numerically cheap technique for quantum state
tomography: compute the least-squares estimator and project it onto the space of states. The …
Compressive statistical learning with random feature moments
We describe a general framework—compressive statistical learning—for resource-efficient
large-scale learning: the training collection is compressed in one pass into a …
Gradient descent for deep matrix factorization: Dynamics and implicit bias towards low rank
In deep learning, it is common to use more network parameters than training points. In such
a scenario of over-parameterization, there are usually multiple networks that achieve zero …
Recovering quantum gates from few average gate fidelities
Characterizing quantum processes is a key task in the development of quantum
technologies, especially at the noisy intermediate scale of today's devices. One method for …
Gridless line spectrum estimation and low-rank Toeplitz matrix compression using structured samplers: A regularization-free approach
This paper considers the problem of compressively sampling wide sense stationary random
vectors with a low rank Toeplitz covariance matrix. Certain families of structured …
Robust nonnegative sparse recovery and the nullspace property of 0/1 measurements
We investigate recovery of nonnegative vectors from non-adaptive compressive
measurements in the presence of noise of unknown power. In the absence of noise, existing …
Mixing properties of stochastic quantum Hamiltonians
Random quantum processes play a central role both in the study of fundamental mixing
processes in quantum mechanics related to equilibration, thermalisation and fast scrambling …