

Poster

Matrix Completion with ReLU Sampling

Huikang Liu · Peng Wang · Longxiu Huang · Qing Qu · Laura Balzano


Abstract: We study the problem of symmetric low-rank matrix completion (MC) with deterministic entry-dependent sampling. In particular, we assume rectified linear unit (ReLU) sampling, where only positive entries are observed. We first demonstrate empirically that the landscape of this MC problem is not globally benign: gradient descent with random initialization will generally converge to stationary points that are not globally optimal. We then prove that when the matrix factor with rank less than $O(\log n)$ satisfies novel assumptions related to, but more general than, those in the deterministic MC literature, the nonconvex objective function is geodesically strongly convex on the manifold in a neighborhood of a planted low-rank matrix. We show that these assumptions are satisfied by a matrix factor with iid Gaussian entries. We demonstrate that a tailored initialization empirically always achieves convergence to the global minimum. Finally, we conduct extensive experiments comparing MC methods in this setting, investigating completion performance with respect to initialization, noise level, dimension, and rank.
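To make the observation model concrete, the sketch below sets up ReLU sampling of a planted symmetric low-rank matrix $M = XX^\top$ with a Gaussian factor and runs plain gradient descent on the masked factorized objective. This is an illustration of the problem setup only, not the paper's algorithm; the step size, iteration count, and small random initialization are arbitrary choices (the paper instead analyzes a tailored initialization, since random starts generally reach non-global stationary points).

```python
import numpy as np

rng = np.random.default_rng(0)

n, r = 200, 3
X_true = rng.standard_normal((n, r))   # Gaussian factor, matching the paper's random model
M = X_true @ X_true.T                  # planted symmetric rank-r matrix

# ReLU sampling: only the strictly positive entries of M are observed.
mask = M > 0

# Factorized objective over the observed entries:
#   f(X) = (1/4) * || mask * (X X^T - M) ||_F^2
def loss_and_grad(X):
    R = (X @ X.T - M) * mask           # masked residual (symmetric)
    return 0.25 * np.sum(R**2), R @ X  # gradient is R X since R is symmetric

# Plain gradient descent from a small random start (illustrative only;
# with random initialization this need not reach the global minimum).
X = 0.1 * rng.standard_normal((n, r))
step = 1.0 / np.linalg.norm(M, 2)      # conservative step size
f0, _ = loss_and_grad(X)
for _ in range(2000):
    _, g = loss_and_grad(X)
    X -= step * g
f_final, _ = loss_and_grad(X)

rel_err = np.linalg.norm(X @ X.T - M) / np.linalg.norm(M)
print(f"initial loss {f0:.3e}, final loss {f_final:.3e}, relative error {rel_err:.3e}")
```

Note that the mask is deterministic given $M$ and depends on the entries themselves, which is what distinguishes this setting from the classical uniform-at-random sampling model.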
