Session A1 - Approximation Theory - Semi-plenary talk
July 11, 14:30 ~ 15:20 - Room B3
Multiscale Methods for Dictionary Learning, Regression and Optimal Transport for data near low-dimensional sets
Johns Hopkins University, United States of America - email@example.com
We discuss a family of ideas, algorithms, and results for analyzing various new and classical problems in the analysis of high-dimensional data sets. The methods we discuss perform well when the data is (nearly) intrinsically low-dimensional. They rely on the idea of performing suitable multiscale geometric decompositions of the data, and on exploiting such decompositions to perform a variety of tasks in signal processing and statistical learning. In particular, we discuss the problem of dictionary learning, where one is interested in constructing, given a training set of signals, a set of vectors (a dictionary) such that the signals admit a sparse representation in terms of the dictionary vectors. We then discuss the problem of regressing a function on an unknown low-dimensional manifold. For both problems we introduce a multiscale estimator together with fast algorithms for constructing it, give finite-sample guarantees for its performance, and discuss its optimality. Finally, we discuss an application of these multiscale decompositions to the fast computation of optimal transportation plans, introduce a multiscale version of optimal transportation distances, and discuss preliminary applications.
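To make the dictionary learning problem concrete, the following is a minimal generic sketch (not the multiscale estimator of the talk) using scikit-learn's `DictionaryLearning`: synthetic signals are generated as sparse combinations of a few atoms, and a dictionary is fit so that each signal is represented by at most three atoms. The data sizes and sparsity level are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)

# Synthetic training set: 100 signals in R^20, each a sparse
# combination of 5 underlying atoms (illustrative choices).
true_atoms = rng.standard_normal((5, 20))
codes = rng.standard_normal((100, 5)) * (rng.random((100, 5)) < 0.3)
X = codes @ true_atoms

# Learn a 5-atom dictionary; represent each signal with at most
# 3 atoms via orthogonal matching pursuit.
dl = DictionaryLearning(
    n_components=5,
    transform_algorithm="omp",
    transform_n_nonzero_coefs=3,
    random_state=0,
)
sparse_codes = dl.fit_transform(X)

print(sparse_codes.shape)  # (100, 5): one sparse code per signal
print(np.count_nonzero(sparse_codes, axis=1).max())  # at most 3
```

The learned dictionary is `dl.components_`; each row of `sparse_codes` gives the coefficients of one signal in that dictionary, with the sparsity enforced by the OMP transform step.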
Joint work with Sam Gerber (University of Oregon), Wenjing Liao (Johns Hopkins University), Stefano Vigogna (Johns Hopkins University).