Conference abstracts

Session B4 - Learning Theory

July 14, 16:00 ~ 16:25

Matrix Completion, Saddlepoints, and Gradient Descent

Jason Lee

University of Southern California, USA   -   jasondlee88@gmail.com

Matrix completion is a fundamental machine learning problem with wide applications in collaborative filtering and recommender systems. Typically, matrix completion is solved by non-convex optimization procedures, which are empirically extremely successful. We prove that the symmetric matrix completion problem has no spurious local minima, meaning all local minima are also global. Thus the matrix completion objective has only saddlepoints and global minima.
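
For concreteness, one common way to write the symmetric matrix completion objective is the least-squares formulation sketched below; the notation (Omega for the set of observed entries, r for the target rank) is an illustrative assumption and does not appear in the abstract itself.

```latex
% A minimal sketch of the symmetric matrix completion objective,
% assuming observed entries \Omega of a rank-r matrix M
% (illustrative notation, not taken from the abstract):
\min_{X \in \mathbb{R}^{n \times r}} \; f(X)
  \;=\; \sum_{(i,j) \in \Omega} \big( (X X^\top)_{ij} - M_{ij} \big)^2
```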

Next, we show that saddlepoints are easy to avoid even for Gradient Descent -- arguably the simplest optimization procedure. We prove that, with probability 1, randomly initialized Gradient Descent converges to a local minimizer. The same result holds for a large class of optimization algorithms, including the proximal point method, mirror descent, and coordinate descent.
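
A minimal sketch of what this result suggests in practice is given below: plain gradient descent from a random starting point on the objective above. This is illustrative code under assumed parameter choices (problem size, sampling density, step size), not the authors' implementation.

```python
# Minimal sketch, not the authors' code: randomly initialized Gradient
# Descent on the symmetric matrix completion objective
#   f(X) = sum_{(i,j) in Omega} ((X X^T)_ij - M_ij)^2.
# All sizes and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-r symmetric matrix M = Z Z^T and a random symmetric
# observation pattern Omega (stored as a boolean mask).
n, r = 50, 3
Z = rng.standard_normal((n, r))
M = Z @ Z.T
mask = rng.random((n, n)) < 0.3
mask = mask | mask.T  # keep the sampling pattern symmetric

def grad(X):
    """Gradient of f: since the residual R = P_Omega(X X^T - M) is
    symmetric here, df/dX = 2 (R + R^T) X = 4 R X."""
    R = mask * (X @ X.T - M)
    return 4.0 * R @ X

# Random initialization: the set of starting points from which gradient
# descent is attracted to a saddle has measure zero, so with probability 1
# this run converges to a local (hence, by the first result, global) minimum.
X = rng.standard_normal((n, r))
eta = 1e-3  # fixed step size, assumed small enough for stability
for _ in range(5000):
    X -= eta * grad(X)

err = np.linalg.norm(mask * (X @ X.T - M)) / np.linalg.norm(mask * M)
print(f"relative error on observed entries: {err:.2e}")
```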

Joint work with Michael Jordan (UC Berkeley), Benjamin Recht (UC Berkeley), Max Simchowitz (UC Berkeley), Rong Ge (Duke University) and Tengyu Ma (Princeton University).



