Hutch++: Optimal stochastic trace estimation

RA Meyer, C Musco, C Musco, DP Woodruff - Symposium on Simplicity in …, 2021 - SIAM
We study the problem of estimating the trace of a matrix A that can only be accessed through
matrix-vector multiplication. We introduce a new randomized algorithm, Hutch++, which …
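The abstract describes trace estimation from matrix-vector products alone. A minimal NumPy sketch of the Hutch++ idea (split the query budget, capture a top subspace with a QR'd sketch, run Hutchinson's estimator on the deflated remainder); the budget split and Rademacher probes here are illustrative choices, not necessarily the paper's exact parameters:

```python
import numpy as np

def hutchpp(matvec, n, m, rng=None):
    # Hutch++-style trace estimator: m matrix-vector products with an
    # n x n matrix A, accessed only through `matvec`.
    rng = np.random.default_rng(rng)
    k = m // 3
    S = rng.choice([-1.0, 1.0], size=(n, k))   # sketching probes
    G = rng.choice([-1.0, 1.0], size=(n, k))   # Hutchinson probes
    Q, _ = np.linalg.qr(matvec(S))             # orthonormal basis for range(AS)
    # Exact trace contribution from the captured top subspace ...
    top = np.trace(Q.T @ matvec(Q))
    # ... plus Hutchinson's estimator on the deflated part (I-QQ^T)A(I-QQ^T),
    # whose trace has much smaller variance once the top directions are removed.
    Gd = G - Q @ (Q.T @ G)
    rest = np.trace(Gd.T @ matvec(Gd)) / k
    return top + rest
```

For matrices with decaying spectra the deflation step is what buys the improvement over plain Hutchinson: most of the trace is computed exactly on the sketched subspace.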

Black-box control for linear dynamical systems

X Chen, E Hazan - Conference on Learning Theory, 2021 - proceedings.mlr.press
We consider the problem of black-box control: the task of controlling an unknown linear time-
invariant dynamical system from a single trajectory without a stabilizing controller. Under the …

Efficient convex optimization requires superlinear memory

A Marsden, V Sharan, A Sidford… - … on Learning Theory, 2022 - proceedings.mlr.press
We show that any memory-constrained, first-order algorithm which minimizes $ d $-
dimensional, $1 $-Lipschitz convex functions over the unit ball to $1/\mathrm {poly}(d) …

Query lower bounds for log-concave sampling

S Chewi, J de Dios Pont, J Li, C Lu, S Narayanan - Journal of the ACM, 2024 - dl.acm.org
Log-concave sampling has witnessed remarkable algorithmic advances in recent years, but
the corresponding problem of proving lower bounds for this task has remained elusive, with …

Low-rank approximation with 1/ε^{1/3} matrix-vector products

A Bakshi, KL Clarkson, DP Woodruff - … of the 54th Annual ACM SIGACT …, 2022 - dl.acm.org
We study iterative methods based on Krylov subspaces for low-rank approximation under
any Schatten-p norm. Here, given access to a matrix A through matrix-vector products, an …
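As a hedged illustration of the access model in this abstract: a rank-1 approximation computed from matrix-vector products alone, here via plain power iteration on AᵀA (the Krylov-subspace methods the paper studies are a more query-efficient refinement of this same idea):

```python
import numpy as np

def top_singular_direction(matvec, rmatvec, n, iters=50, rng=None):
    # Estimate the top right singular vector of A using only the products
    # A v (`matvec`) and A^T u (`rmatvec`): power iteration on A^T A.
    rng = np.random.default_rng(rng)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = rmatvec(matvec(v))      # one step: v <- A^T A v
        v /= np.linalg.norm(v)      # renormalize to avoid overflow
    return v
```

The best rank-1 approximation is then (Av)vᵀ; each iteration costs two matrix-vector products, which is exactly the resource these papers count.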

Optimal eigenvalue approximation via sketching

W Swartworth, DP Woodruff - Proceedings of the 55th Annual ACM …, 2023 - dl.acm.org
Given a symmetric matrix A, we show from the simple sketch GAGᵀ, where G is a Gaussian matrix with k = O(1/ε²) rows, that there is a procedure for approximating all eigenvalues of A …
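A small NumPy sketch of the kind of compression the abstract refers to: the spectrum of (1/k)·GAGᵀ for a Gaussian G tracks the outlying eigenvalues of A. The scaling and parameter choices below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def sketched_eigs(A, k, rng=None):
    # Compress a symmetric n x n matrix A to a k x k matrix via the
    # Gaussian sketch G A G^T; its eigenvalues approximate A's outlying
    # eigenvalues, with k growing as O(1/eps^2) for additive error eps.
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    G = rng.standard_normal((k, n)) / np.sqrt(k)
    return np.linalg.eigvalsh(G @ A @ G.T)
```

Note the sketch is quadratic in A (G multiplies on both sides), so it can be formed from a single pass over A's entries without storing A afterwards.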

Querying a matrix through matrix-vector products

X Sun, DP Woodruff, G Yang, J Zhang - ACM Transactions on Algorithms …, 2021 - dl.acm.org
We consider algorithms with access to an unknown matrix M ∈ F^{n×d} via matrix-vector products, namely, the algorithm chooses vectors v₁, …, v_q, and observes Mv₁, …, Mv_q. Here …
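As a concrete baseline for this query model: d standard-basis queries e₁, …, e_d reveal M exactly, since Me_i is just the i-th column; the interesting question the paper asks is what fewer, adaptively chosen queries can learn.

```python
import numpy as np

def recover_matrix(matvec, d):
    # Learn an unknown n x d matrix M from exactly d matrix-vector
    # queries: querying the standard basis vector e_i returns column i.
    cols = [matvec(np.eye(d)[:, i]) for i in range(d)]
    return np.column_stack(cols)
```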

Memory-query tradeoffs for randomized convex optimization

X Chen, B Peng - 2023 IEEE 64th Annual Symposium on …, 2023 - ieeexplore.ieee.org
We show that any randomized first-order algorithm which minimizes a d-dimensional, 1-Lipschitz convex function over the unit ball must either use Ω(d^{2−δ}) bits of memory …

Krylov methods are (nearly) optimal for low-rank approximation

A Bakshi, S Narayanan - 2023 IEEE 64th Annual Symposium …, 2023 - ieeexplore.ieee.org
We consider the problem of rank-1 low-rank approximation (LRA) in the matrix-vector product model under various Schatten norms: min_{‖u‖₂=1} ‖A(I − uu …

Vector-matrix-vector queries for solving linear algebra, statistics, and graph problems

C Rashtchian, DP Woodruff, H Zhu - arXiv preprint arXiv:2006.14015, 2020 - arxiv.org
We consider the general problem of learning about a matrix through vector-matrix-vector queries. These queries provide the value of uᵀ…
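One illustrative use of uᵀMv queries (an assumed example, not necessarily an algorithm from the paper): a randomized symmetry test that compares uᵀMv with vᵀMu for random pairs, using two scalar queries per trial instead of ever materializing M.

```python
import numpy as np

def is_probably_symmetric(query, n, trials=10, rng=None):
    # `query(u, v)` returns the scalar u^T M v for an unknown n x n M.
    # M is symmetric iff u^T M v == v^T M u for all u, v; a random pair
    # detects asymmetry with high probability, since the difference
    # equals u^T (M - M^T) v, which is nonzero a.s. when M != M^T.
    rng = np.random.default_rng(rng)
    for _ in range(trials):
        u, v = rng.standard_normal(n), rng.standard_normal(n)
        if not np.isclose(query(u, v), query(v, u)):
            return False
    return True
```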