Conference abstracts

Session B4 - Learning Theory

July 14, 17:30 ~ 17:55

Projected gradient descent with nonconvex constraints

Rina Barber

University of Chicago, USA   -   rina@uchicago.edu

Nonconvex optimization arises in many applications of high-dimensional statistics and data analysis, where data models and regularization terms both often exhibit nonconvexity. While convex programs for structured signal recovery have been widely studied, comparatively little is known about the theoretical properties of nonconvex optimization methods. In this talk I will discuss the problem of projected gradient descent over nonconvex constraints, where the local geometry of the constraint set is closely tied to the algorithm's convergence behavior. By measuring the local concavity of the constraint set, we can give concrete guarantees for convergence of projected gradient descent. Furthermore, by relaxing these geometric conditions, we can allow for approximate calculation of the projection step to speed up the algorithm.
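The algorithm in question alternates a gradient step with a Euclidean projection onto the constraint set. A minimal sketch of this idea, using a least-squares objective and a k-sparse constraint set as a hypothetical example of a nonconvex constraint (the abstract does not specify a particular problem; projection onto the sparsity set is hard thresholding):

```python
import numpy as np

def project_sparse(x, k):
    # Euclidean projection onto the nonconvex set {x : ||x||_0 <= k}:
    # keep the k largest-magnitude entries and zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def projected_gradient_descent(A, b, k, step, n_iter=500):
    # Minimize ||Ax - b||^2 over k-sparse vectors by alternating a
    # gradient step with projection onto the constraint set.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = project_sparse(x - step * grad, k)
    return x
```

In this sparse setting the scheme is known as iterative hard thresholding; the talk's guarantees concern general nonconvex constraint sets, with the projection possibly computed only approximately.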

Joint work with Wooseok Ha.




FoCM 2017.