Conference paper

Low rank approximation and regression in input sparsity time


Abstract

We design a new distribution over $\mathrm{poly}(r\varepsilon^{-1}) \times n$ matrices $S$ so that, for any fixed $n \times d$ matrix $A$ of rank $r$, with probability at least 9/10, $\|SAx\|_2 = (1 \pm \varepsilon)\|Ax\|_2$ simultaneously for all $x \in \mathbb{R}^d$. Such a matrix $S$ is called a subspace embedding. Furthermore, $SA$ can be computed in $O(\mathrm{nnz}(A))$ time, where $\mathrm{nnz}(A)$ is the number of non-zero entries of $A$. This improves over all previous subspace embeddings, which required at least $\Omega(nd \log d)$ time to achieve this property. We call our matrices $S$ sparse embedding matrices.

Using our sparse embedding matrices, we obtain the fastest known algorithms for overconstrained least-squares regression, low-rank approximation, approximating all leverage scores, and $\ell_p$-regression:

• To output an $x'$ for which $\|Ax' - b\|_2 \le (1 + \varepsilon)\min_x \|Ax - b\|_2$ for an $n \times d$ matrix $A$ and an $n \times 1$ column vector $b$, we obtain an algorithm running in $O(\mathrm{nnz}(A)) + \tilde{O}(d^3\varepsilon^{-2})$ time, and another in $O(\mathrm{nnz}(A)\log(1/\varepsilon)) + \tilde{O}(d^3\log(1/\varepsilon))$ time. (Here $\tilde{O}(f) = f \cdot \log^{O(1)}(f)$.)

• To obtain a decomposition of an $n \times n$ matrix $A$ into a product of an $n \times k$ matrix $L$, a $k \times k$ diagonal matrix $D$, and an $n \times k$ matrix $W$, for which $\|A - LDW^T\|_F \le (1 + \varepsilon)\|A - A_k\|_F$, where $A_k$ is the best rank-$k$ approximation, our algorithm runs in $O(\mathrm{nnz}(A)) + \tilde{O}(nk^2\varepsilon^{-4}\log n + k^3\varepsilon^{-5}\log^2 n)$ time.

• To output an approximation to all leverage scores of an $n \times d$ input matrix $A$ simultaneously, with constant relative error, our algorithms run in $O(\mathrm{nnz}(A)\log n) + \tilde{O}(r^3)$ time.

• To output an $x'$ for which $\|Ax' - b\|_p \le (1 + \varepsilon)\min_x \|Ax - b\|_p$ for an $n \times d$ matrix $A$ and an $n \times 1$ column vector $b$, we obtain an algorithm running in $O(\mathrm{nnz}(A)\log n) + \mathrm{poly}(r\varepsilon^{-1})$ time, for any constant $1 \le p < \infty$.

We optimize the polynomial factors in the above running times and show various tradeoffs. Finally, we provide preliminary experimental results which suggest that our algorithms are of interest in practice.

Copyright 2013 ACM.
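To make the construction concrete, below is a minimal sketch (not the authors' code) of a sparse embedding matrix of the kind the abstract describes: each column of $S$ has a single nonzero entry, a random sign placed in a uniformly random row, so applying $S$ costs one pass over the nonzeros of $A$. It is paired with the standard sketch-and-solve use for least-squares regression. The sketch dimension m = 4*d*d, the function names, and the test data are illustrative assumptions, not taken from the paper; NumPy and SciPy are assumed.

```python
# A minimal sketch (illustrative, not the authors' reference implementation)
# of a sparse embedding ("CountSketch") matrix S: one nonzero per column,
# so S @ A touches each nonzero of A exactly once -- O(nnz(A)) time.
import numpy as np
import scipy.sparse as sp

def sparse_embedding(m, n, rng):
    """Return an m x n sparse embedding matrix S in CSR format.

    S[h(j), j] = sigma(j), where h hashes column j to a uniform row in [m]
    and sigma(j) is an independent random sign.
    """
    rows = rng.integers(0, m, size=n)        # h(j): target row per column
    signs = rng.choice([-1.0, 1.0], size=n)  # sigma(j): random sign per column
    cols = np.arange(n)
    return sp.csr_matrix((signs, (rows, cols)), shape=(m, n))

def sketched_least_squares(A, b, m, rng):
    """Approximate argmin_x ||Ax - b||_2 by solving the sketched problem.

    With m on the order of d^2 / eps^2 rows (a commonly cited bound for
    CountSketch subspace embeddings), the result satisfies the (1 + eps)
    guarantee with constant probability.
    """
    n = A.shape[0]
    S = sparse_embedding(m, n, rng)
    SA = S @ A                               # costs O(nnz(A))
    Sb = S @ b
    SA_dense = SA.toarray() if sp.issparse(SA) else SA
    x, *_ = np.linalg.lstsq(SA_dense, Sb, rcond=None)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 100_000, 20
    A = sp.random(n, d, density=1e-2, format="csr", random_state=0)
    x_true = rng.standard_normal(d)
    b = A @ x_true + 1e-3 * rng.standard_normal(n)
    # Sketch dimension m = 4*d^2 is an illustrative choice, not a tuned bound.
    x_hat = sketched_least_squares(A, b, m=4 * d * d, rng=rng)
    # Compare residuals: the sketched solution should be near-optimal.
    x_opt, *_ = np.linalg.lstsq(A.toarray(), b, rcond=None)
    print(np.linalg.norm(A @ x_hat - b), np.linalg.norm(A @ x_opt - b))
```

Note the design point this illustrates: because each column of $S$ has exactly one nonzero, the sketched matrix $SA$ is formed in a single pass over the entries of $A$, which is what distinguishes these embeddings from earlier $\Omega(nd \log d)$-time constructions such as subsampled randomized Hadamard transforms.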

Date

11 Jul 2013

Publication

STOC 2013

Authors

Kenneth L. Clarkson, David P. Woodruff
