Recent developments in boolean matrix factorization

P Miettinen, S Neumann - arXiv preprint arXiv:2012.03127, 2020 - arxiv.org
The goal of Boolean Matrix Factorization (BMF) is to approximate a given binary matrix as
the product of two low-rank binary factor matrices, where the product of the factor matrices is …
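The BMF objective can be illustrated on a tiny instance (a minimal sketch, not the algorithms surveyed in the paper): the Boolean product replaces arithmetic sum with logical OR, and the approximation error is the number of mismatched entries.

```python
import numpy as np

# Tiny illustration of the BMF objective: approximate a binary matrix X by
# the Boolean product of binary factors B (n x k) and C (k x m), where
# (B . C)[i, j] = OR over l of (B[i, l] AND C[l, j]).
X = np.array([[1, 1, 0],
              [1, 1, 0],
              [0, 0, 1]], dtype=bool)

B = np.array([[1, 0],
              [1, 0],
              [0, 1]], dtype=bool)
C = np.array([[1, 1, 0],
              [0, 0, 1]], dtype=bool)

# Boolean matrix product: OR over the ANDs of matched entries.
boolean_product = np.einsum('il,lj->ij', B.astype(int), C.astype(int)) > 0

# Reconstruction error = number of mismatched entries (Hamming distance).
error = int(np.sum(boolean_product != X))
```

Here the factors reproduce X exactly, so the error is 0; in general BMF seeks binary factors minimizing this count.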

The why and how of nonnegative matrix factorization

N Gillis - … , optimization, kernels, and support vector machines, 2014 - books.google.com
Nonnegative matrix factorization (NMF) has become a widely used tool for the analysis of
high-dimensional data as it automatically extracts sparse and meaningful features from a set …
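As a minimal sketch of what an NMF solver does, the classic Lee–Seung multiplicative updates for the Frobenius objective ||V − WH||² can be written in a few lines (this is one standard algorithm, not the specific methods discussed in the survey):

```python
import numpy as np

# NMF via multiplicative updates for ||V - W H||_F^2 (Lee & Seung).
# All factors stay elementwise nonnegative throughout.
rng = np.random.default_rng(0)
V = rng.random((20, 15))          # nonnegative data matrix
r = 4                             # target rank
W = rng.random((20, r))
H = rng.random((r, 15))
eps = 1e-10                       # guard against division by zero

for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

residual = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The multiplicative form preserves nonnegativity automatically, which is why sparse, parts-based features emerge without explicit constraints.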

[BOOK][B] Nonnegative matrix factorization

N Gillis - 2020 - SIAM
Identifying the underlying structure of a data set and extracting meaningful information is a
key problem in data analysis. Simple and powerful methods to achieve this goal are linear …

Coresets for clustering in euclidean spaces: importance sampling is nearly optimal

L Huang, NK Vishnoi - Proceedings of the 52nd Annual ACM SIGACT …, 2020 - dl.acm.org
Given a collection of n points in ℝ^d, the goal of the (k, z)-clustering problem is to find a
subset of k “centers” that minimizes the sum of the z-th powers of the Euclidean distance of …
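The (k, z)-clustering objective mentioned above is easy to evaluate directly; a small illustration (with `kz_cost` as a hypothetical helper name), where z = 2 recovers the k-means cost and z = 1 the k-median cost:

```python
import numpy as np

def kz_cost(points, centers, z):
    # (k, z)-clustering cost: sum over points of (distance to the
    # nearest center) raised to the z-th power.
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return float(np.sum(dists.min(axis=1) ** z))

points = np.array([[0.0, 0.0], [2.0, 0.0], [10.0, 0.0]])
centers = np.array([[0.0, 0.0], [10.0, 0.0]])

cost_kmeans = kz_cost(points, centers, z=2)   # nearest distances: 0, 2, 0
cost_kmedian = kz_cost(points, centers, z=1)
```

A coreset replaces `points` with a small weighted subset on which this same cost is approximately preserved for every choice of centers.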

Dynamic tensor product regression

A Reddy, Z Song, L Zhang - Advances in Neural …, 2022 - proceedings.neurips.cc
In this work, we initiate the study of \emph{Dynamic Tensor Product Regression}. One has
matrices $A_1 \in \mathbb{R}^{n_1 \times d_1}, \ldots, A_q \in \mathbb{R}^{n_q \times d_q}$ …

New subset selection algorithms for low rank approximation: Offline and online

DP Woodruff, T Yasuda - Proceedings of the 55th Annual ACM …, 2023 - dl.acm.org
Subset selection for the rank-k approximation of an n × d matrix A offers improvements in the
interpretability of matrices, as well as a variety of computational savings. This problem is well …
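A simple baseline conveys what column subset selection buys (a greedy pivoting heuristic, essentially column-pivoted Gram–Schmidt; not the offline/online algorithms of the paper): pick k columns of A and measure how well projecting onto their span reconstructs A.

```python
import numpy as np

def greedy_column_select(A, k):
    # Greedily pick the column with the largest residual norm, then
    # deflate that direction from all columns.
    R = A.astype(float).copy()
    chosen = []
    for _ in range(k):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)
    return chosen

rng = np.random.default_rng(1)
n, d, k = 30, 12, 3
A = rng.random((n, k)) @ rng.random((k, d))        # exactly rank k

cols = greedy_column_select(A, k)
S = A[:, cols]
# Best approximation of A within the span of the selected columns.
proj = S @ np.linalg.lstsq(S, A, rcond=None)[0]
rel_err = np.linalg.norm(A - proj) / np.linalg.norm(A)
```

Because A has exact rank k here, k well-chosen columns span its column space and the relative error is essentially zero; the interpretability gain is that the "features" of the approximation are actual columns of A.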

Optimal sketching for kronecker product regression and low rank approximation

H Diao, R Jayaram, Z Song, W Sun… - Advances in neural …, 2019 - proceedings.neurips.cc
We study the Kronecker product regression problem, in which the design matrix is a
Kronecker product of two or more matrices. Formally, given $A_i \in \mathbb{R}^{n_i \times d_i}$ for …
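The naive baseline for this problem makes the setup concrete: explicitly form A = A₁ ⊗ A₂ and solve least squares on it. The point of the sketching algorithms above is precisely to avoid materializing A, whose row count is the product n₁n₂; this sketch only illustrates the problem being solved.

```python
import numpy as np

# Kronecker product regression, solved naively: form A = kron(A1, A2)
# and minimize ||A x - b||_2 by ordinary least squares.
rng = np.random.default_rng(2)
A1 = rng.random((5, 2))
A2 = rng.random((6, 3))
A = np.kron(A1, A2)                  # shape (5*6, 2*3) = (30, 6)

x_true = rng.random(6)
b = A @ x_true                       # consistent right-hand side

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
err = float(np.linalg.norm(x_hat - x_true))
```

With a consistent right-hand side and full column rank, least squares recovers x exactly up to floating point; at scale, the 30-row matrix would instead have billions of rows, which is what sketching sidesteps.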

ISLET: Fast and optimal low-rank tensor regression via importance sketching

AR Zhang, Y Luo, G Raskutti, M Yuan - SIAM journal on mathematics of data …, 2020 - SIAM
In this paper, we develop a novel procedure for low-rank tensor regression, namely
Importance Sketching Low-rank Estimation for Tensors (ISLET). The central idea behind …

Low-rank approximation with 1/ε^{1/3} matrix-vector products

A Bakshi, KL Clarkson, DP Woodruff - … of the 54th Annual ACM SIGACT …, 2022 - dl.acm.org
We study iterative methods based on Krylov subspaces for low-rank approximation under
any Schatten-p norm. Here, given access to a matrix A through matrix-vector products, an …
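The matrix-vector-product access model can be sketched with plain block subspace iteration (a standard baseline, not the paper's Krylov method, whose contribution is needing only on the order of 1/ε^{1/3} products): the algorithm touches M only through products M @ Q.

```python
import numpy as np

# Low-rank approximation using only matrix-vector products with M:
# block subspace iteration on a symmetric matrix with a spectral gap.
rng = np.random.default_rng(3)
n, k, iters = 40, 2, 30

# Symmetric test matrix M = U diag(eigs) U^T with two dominant eigenvalues.
U, _ = np.linalg.qr(rng.random((n, n)))
eigs = np.array([10.0, 9.0] + [0.1] * (n - 2))
M = (U * eigs) @ U.T

Q = rng.random((n, k))
for _ in range(iters):
    Q, _ = np.linalg.qr(M @ Q)       # k matrix-vector products per step

# Rank-k approximation from the converged subspace.
Mk = Q @ (Q.T @ M)
rel_err = np.linalg.norm(M - Mk) / np.linalg.norm(M)
```

With the large gap here the subspace converges quickly and the error approaches the best rank-2 error; Krylov methods reuse the same products to build a richer subspace and converge in far fewer iterations.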

Quantum-inspired algorithms from randomized numerical linear algebra

N Chepurko, K Clarkson, L Horesh… - International …, 2022 - proceedings.mlr.press
We create classical (non-quantum) dynamic data structures supporting queries for
recommender systems and least-squares regression that are comparable to their quantum …