Session A1 - Approximation Theory
July 10, 15:30 ~ 16:20
Gödel and Turing in approximation theory - On sampling, reconstruction, sparsity and computational barriers
University of Cambridge, United Kingdom - firstname.lastname@example.org
The problem of approximating a function from a finite set of linear measurements (samples) has been a core part of approximation theory over recent decades. The key issue is to design a decoder that takes the samples and produces an approximation to the function in an accurate and robust way. Once the decoder is designed, one is then faced with the problem of finding a robust algorithm that can compute the output of the decoder, or at least an approximation to it. And herein lies the crucial question: do such algorithms always exist? For linear decoders the answer is always yes; for non-linear decoders, however, we show that the answer is often no. Moreover, and perhaaps surprisingly, the answer is no for many of the popular non-linear decoders that are used in practice. In fact, a positive answer to the question would imply the decidability of famous undecidable problems such as the Halting problem, and hence these problems are formally non-computable. In this talk I will discuss this paradox and demonstrate how, by adding sparsity as a key ingredient, one can ensure the existence of robust algorithms in non-linear approximation theory, and explain the success of their use in practice. Moreover, I will show how the above paradox opens the door to a rich classification theory of non-linear decoders that contains many surprises.
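The abstract does not specify a particular non-linear decoder, but a standard illustrative example from the sparse-recovery literature is greedy orthogonal matching pursuit. The sketch below (an assumption for illustration, not the speaker's construction; the function `omp` and the dimensions `m`, `n`, `k` are invented here) recovers a k-sparse vector from an underdetermined set of linear samples y = Ax:

```python
import numpy as np

def omp(A, y, k, tol=1e-10):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = A @ x."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit of y on the columns selected so far
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x

rng = np.random.default_rng(0)
m, n, k = 60, 120, 4                           # fewer samples (m) than unknowns (n)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random linear measurements
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)
y = A @ x_true                                 # the finite set of samples
x_hat = omp(A, y, k)                           # a non-linear decoder
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The decoder is non-linear because the selected support depends on the data y; sparsity (k much smaller than n) is what makes recovery from m < n samples possible at all, echoing the abstract's point that sparsity is the key ingredient for algorithmic tractability.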
Joint work with Alex Bastounis (University of Cambridge), Laura Terhaar (University of Cambridge) and Verner Vlacic (University of Cambridge).