Conference abstracts

Session B4 - Learning Theory

July 14, 15:30 ~ 15:55

Parallelizing Spectral Algorithms for Kernel Learning

Gilles Blanchard

University of Potsdam, Germany - gilles.blanchard@math.uni-potsdam.de

We consider a distributed learning approach to supervised learning for a large class of spectral regularization methods in a reproducing kernel Hilbert space framework. The data set of size $n$ is partitioned into $m = O(n^\alpha)$ disjoint subsets, which can be sent to different machines. On each subset, a spectral regularization method (belonging to a large class that includes, in particular, Kernel Ridge Regression, $L^2$-boosting and spectral cut-off) is applied. The regression function $f$ is then estimated by averaging the local estimators. This leads to a substantial reduction in computation time compared to using a single machine. We show that minimax optimal rates of convergence are preserved if $m$ grows sufficiently slowly with $n$, with the upper limit on the exponent $\alpha$ depending on the smoothness assumptions on $f$ and the intrinsic dimensionality.

Joint work with Nicole Mücke.
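The divide-and-average scheme described in the abstract can be illustrated with a short sketch. The code below assumes plain Kernel Ridge Regression with a Gaussian kernel as the local spectral method; the function names (e.g. `distributed_krr`), the hyperparameters, and the synthetic data are illustrative assumptions, not the authors' implementation or experimental setup.

```python
# Minimal sketch of distributed kernel ridge regression with averaging.
# Assumption: a Gaussian kernel and plain KRR as the local spectral method;
# all names and parameter values here are illustrative.
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def fit_krr(X, y, lam=1e-2, bandwidth=1.0):
    """Solve (K + n*lam*I) alpha = y on one subset and return a predictor."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, bandwidth)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda X_new: gaussian_kernel(X_new, X, bandwidth) @ alpha

def distributed_krr(X, y, m, lam=1e-2, bandwidth=1.0):
    """Partition the data into m disjoint subsets, fit KRR on each,
    and return the average of the m local predictors."""
    subsets = np.array_split(np.random.permutation(len(X)), m)
    local = [fit_krr(X[idx], y[idx], lam, bandwidth) for idx in subsets]
    return lambda X_new: np.mean([f(X_new) for f in local], axis=0)

# Toy usage: n = 2000 noisy samples of a smooth target, m = sqrt(n) subsets.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(2000, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(2000)
f_hat = distributed_krr(X, y, m=int(len(X) ** 0.5), lam=1e-3, bandwidth=0.3)
X_test = np.linspace(-1.0, 1.0, 200)[:, None]
print("test MSE:", np.mean((f_hat(X_test) - np.sin(np.pi * X_test[:, 0])) ** 2))
```

Each local solve costs $O((n/m)^3)$ instead of $O(n^3)$, which is the source of the computational saving; the theoretical question addressed in the talk is how large the exponent $\alpha$ in $m = O(n^\alpha)$ may be before the averaged estimator loses the minimax optimal rate.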



