Publication
ICML 2012
Conference paper

Efficient and practical stochastic subgradient descent for nuclear norm regularization

Abstract

We describe novel subgradient methods for a broad class of matrix optimization problems involving nuclear norm regularization. Unlike existing approaches, our method executes very cheap iterations by combining low-rank stochastic subgradients with efficient incremental SVD updates, made possible by highly optimized and parallelizable dense linear algebra operations on small matrices. Our practical algorithms always maintain a low-rank factorization of iterates that can be conveniently held in memory and efficiently multiplied to generate predictions in matrix completion settings. Empirical comparisons confirm that our approach is highly competitive with several recently proposed state-of-the-art solvers for such problems.
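As a rough illustration of the problem class the abstract refers to, the sketch below solves a small nuclear-norm regularized matrix completion instance with stochastic proximal-gradient steps and singular value soft-thresholding. It is not the paper's algorithm: for clarity it recomputes a full SVD at every step instead of the low-rank stochastic subgradients and incremental SVD updates described above, and all function names, parameters, and the toy data are assumptions made for this example.

```python
# Illustrative sketch only (not the paper's method): stochastic proximal-gradient
# for   min_X  0.5 * || P_Omega(X - M) ||_F^2  +  lam * ||X||_*
# Uses a full SVD per step, so it is only practical for small matrices.

import numpy as np

def soft_threshold_singular_values(X, tau):
    """Prox of tau * nuclear norm: shrink singular values toward zero."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def stochastic_prox_grad_completion(M, mask, lam=0.1, step=0.5,
                                    batch_frac=0.2, iters=800, seed=0):
    """Estimate a low-rank matrix from the observed entries M[mask]."""
    rng = np.random.default_rng(seed)
    obs = np.argwhere(mask)                      # indices of observed entries
    X = np.zeros_like(M, dtype=float)
    batch = max(1, int(batch_frac * len(obs)))
    for _ in range(iters):
        idx = obs[rng.choice(len(obs), size=batch, replace=False)]
        rows, cols = idx[:, 0], idx[:, 1]
        G = np.zeros_like(X)                     # sparse stochastic gradient
        G[rows, cols] = X[rows, cols] - M[rows, cols]
        X = soft_threshold_singular_values(X - step * G, step * lam)
    return X

if __name__ == "__main__":
    # Toy data: a rank-5 matrix with roughly 60% of entries observed.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((30, 5)) @ rng.standard_normal((5, 30))
    mask = rng.random(A.shape) < 0.6
    X_hat = stochastic_prox_grad_completion(A, mask)
    err = np.linalg.norm((X_hat - A)[~mask]) / np.linalg.norm(A[~mask])
    print(f"relative error on unobserved entries: {err:.3f}")
```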

Date

10 Oct 2012

