Hutch++: Optimal stochastic trace estimation
We study the problem of estimating the trace of a matrix A that can only be accessed through
matrix-vector multiplication. We introduce a new randomized algorithm, Hutch++, which …
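As a concrete illustration of the Hutch++ recipe (split the matvec budget between a low-rank sketch and plain Hutchinson on the deflated remainder), here is a minimal NumPy sketch; the function name and the even three-way budget split are our own choices, not necessarily the paper's exact parameterization:

```python
import numpy as np

def hutchpp(matvec, n, m, seed=0):
    """Estimate tr(A) for an n x n matrix A accessed only via matvec(X) = A @ X,
    using roughly m matrix-vector products split three ways (Hutch++ style)."""
    rng = np.random.default_rng(seed)
    k = m // 3
    # 1) Sketch: orthonormal basis Q for range(A S), capturing large eigendirections.
    S = rng.choice([-1.0, 1.0], size=(n, k))
    Q, _ = np.linalg.qr(matvec(S))
    # 2) Exact trace of the projected part, tr(Q^T A Q).
    t_low = np.trace(Q.T @ matvec(Q))
    # 3) Plain Hutchinson on the deflated remainder (I - QQ^T) A (I - QQ^T),
    #    which is unbiased for tr(A) - tr(Q^T A Q) since projections are idempotent.
    G = rng.choice([-1.0, 1.0], size=(n, k))
    G = G - Q @ (Q.T @ G)
    t_rest = np.trace(G.T @ matvec(G)) / k
    return t_low + t_rest
```

For matrices with decaying spectra, the deflation step removes most of the variance that plain Hutchinson would incur.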
Black-box control for linear dynamical systems
We consider the problem of black-box control: the task of controlling an unknown linear time-
invariant dynamical system from a single trajectory without a stabilizing controller. Under the …
Efficient convex optimization requires superlinear memory
We show that any memory-constrained, first-order algorithm which minimizes $d$-
dimensional, $1$-Lipschitz convex functions over the unit ball to $1/\mathrm{poly}(d)$ …
Query lower bounds for log-concave sampling
Log-concave sampling has witnessed remarkable algorithmic advances in recent years, but
the corresponding problem of proving lower bounds for this task has remained elusive, with …
Low-rank approximation with 1/ε^{1/3} matrix-vector products
We study iterative methods based on Krylov subspaces for low-rank approximation under
any Schatten-p norm. Here, given access to a matrix A through matrix-vector products, an …
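The simplest matrix-vector-product method in this family is power iteration on AᵀA, which Krylov-subspace methods accelerate. Below is a minimal power-iteration baseline for the rank-1 case, a sketch under our own naming rather than the paper's algorithm:

```python
import numpy as np

def top_singular_pair(matvec, rmatvec, n, iters=30, seed=0):
    """Approximate the top singular value/vector of A using only
    matrix-vector products with A (matvec) and A^T (rmatvec):
    power iteration v <- A^T A v, the baseline Krylov methods improve on."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = rmatvec(matvec(v))       # one step of power iteration on A^T A
        v = w / np.linalg.norm(w)
    sigma = np.linalg.norm(matvec(v))  # estimated top singular value ||A v||
    return sigma, v
```

A full Krylov method would keep the whole subspace span{v, AᵀAv, (AᵀA)²v, …} rather than just the last iterate, which is where the improved query complexity comes from.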
Optimal eigenvalue approximation via sketching
W Swartworth, DP Woodruff - Proceedings of the 55th Annual ACM …, 2023 - dl.acm.org
Given a symmetric matrix A, we show from the simple sketch GAGᵀ, where G is a Gaussian
matrix with k = O(1/ε²) rows, that there is a procedure for approximating all eigenvalues of A …
Querying a matrix through matrix-vector products
We consider algorithms with access to an unknown matrix M ∈ F^{n×d} via matrix-vector
products, namely, the algorithm chooses vectors v_1, …, v_q, and observes Mv_1, …, Mv_q. Here …
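To make the access model concrete, here is the trivial information-theoretic baseline in this query model: d queries with standard basis vectors recover M exactly, one column per query. The interesting question studied in this line of work is how many fewer queries suffice for specific properties of M. The function name is ours:

```python
import numpy as np

def recover_matrix(matvec, n, d):
    """Baseline in the matrix-vector query model: the d queries
    M e_1, ..., M e_d recover an unknown n x d matrix M exactly,
    since each query reveals one column of M."""
    cols = []
    for j in range(d):
        e = np.zeros(d)
        e[j] = 1.0
        cols.append(matvec(e))
    return np.column_stack(cols)
```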
Memory-query tradeoffs for randomized convex optimization
X Chen, B Peng - 2023 IEEE 64th Annual Symposium on …, 2023 - ieeexplore.ieee.org
We show that any randomized first-order algorithm which minimizes a d-dimensional, 1-
Lipschitz convex function over the unit ball must either use Ω(d^{2−δ}) bits of memory …
Krylov methods are (nearly) optimal for low-rank approximation
A Bakshi, S Narayanan - 2023 IEEE 64th Annual Symposium …, 2023 - ieeexplore.ieee.org
We consider the problem of rank-1 low-rank approximation (LRA) in the matrix-vector
product model under various Schatten norms: min_{‖u‖_2 = 1} ‖A(I − uu …
Vector-matrix-vector queries for solving linear algebra, statistics, and graph problems
We consider the general problem of learning about a matrix through vector-matrix-vector
queries. These queries provide the value of $\boldsymbol{u}^{\mathrm{T}}\boldsymbol …
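Two standard examples of what single uᵀMv queries can reveal, sketched in NumPy (function names are ours, not the paper's): reading one entry with basis vectors, and Hutchinson trace estimation with u = v Rademacher:

```python
import numpy as np

def uMv(M, u, v):
    """One vector-matrix-vector query: returns u^T M v."""
    return u @ M @ v

def entry(M, i, j, n):
    """Read the single entry M[i, j] with one query, using u = e_i, v = e_j."""
    u = np.zeros(n); u[i] = 1.0
    v = np.zeros(n); v[j] = 1.0
    return uMv(M, u, v)

def hutchinson_trace(M, n, k=100, seed=0):
    """Estimate tr(M) with k queries of the form u^T M u,
    u with i.i.d. +/-1 entries (the classic Hutchinson estimator)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(k):
        u = rng.choice([-1.0, 1.0], size=n)
        total += uMv(M, u, u)
    return total / k
```

Many of the problems in this line of work ask how few such queries are needed for tasks like trace, rank, or graph-property estimation.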