Conference abstracts

Session B4 - Learning Theory

July 14, 14:30 ~ 14:55

Random Moments for Compressive Statistical Learning

Rémi Gribonval

Inria - Rennes, France   -   remi.gribonval@inria.fr

We describe a general framework, {\em sketched statistical learning}, for memory-efficient large-scale learning. A low-dimensional {\em sketch} (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task is computed in a single pass over the training collection. A near-minimizer of the risk is then computed from the sketch alone using the generalized method of moments.
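As an illustration of the one-pass computation (a hedged sketch, not the authors' implementation: the choice of random Fourier moments, the helper function `sketch`, and all numerical parameters below are assumptions made for concreteness), such a vector of random empirical generalized moments can be formed by averaging random feature functions of each sample:

    import numpy as np

    def sketch(data_stream, omegas):
        # One-pass empirical sketch: z_j = (1/n) * sum_i exp(i <omega_j, x_i>)
        z = np.zeros(omegas.shape[0], dtype=complex)
        n = 0
        for x in data_stream:                # single pass over the training collection
            z += np.exp(1j * (omegas @ x))   # one random generalized moment per frequency
            n += 1
        return z / n

    rng = np.random.default_rng(0)
    d, m = 2, 100                            # data dimension and sketch size (assumed values)
    omegas = rng.normal(size=(m, d))         # random frequencies defining the moments
    X = rng.normal(size=(10000, d))          # toy training collection
    z = sketch(iter(X), omegas)              # m-dimensional sketch, independent of the number of samples

The sketch has a fixed dimension m, so its memory footprint does not grow with the size of the training collection.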

The framework is illustrated on sketched clustering and sketched PCA, using empirical algorithms inspired by the sparse recovery algorithms of compressive sensing.
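For sketched clustering, a hedged formulation of the underlying moment-matching step (the notation $\Phi$, $\hat z$, $\alpha_\ell$ is introduced here for illustration, assuming the centroids are encoded as a mixture of Diracs; this is not necessarily the authors' exact objective) reads:

\[
  (\hat c_1,\dots,\hat c_k,\hat\alpha)
  \;\in\; \arg\min_{c_1,\dots,c_k,\;\alpha \ge 0}
  \Big\| \hat z - \sum_{\ell=1}^{k} \alpha_\ell\, \Phi(c_\ell) \Big\|_2 ,
  \qquad
  \hat z = \frac{1}{n}\sum_{i=1}^{n} \Phi(x_i),
\]

where $\Phi$ collects the random generalized moments and $\hat z$ is the empirical sketch of the data.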

We provide theoretical guarantees controlling the resulting generalization error, with a sketch size driven by an intrinsic measure of the complexity of the learning task, which is independent of the volume of the training collection. Volume reductions of several orders of magnitude are demonstrated while preserving the overall learning quality.

Joint work with Nicolas Keriven (Université de Rennes 1, France), Yann Traonmilin (Inria - Rennes, France) and Gilles Blanchard (Universität Potsdam, Germany).



