#### Conference abstracts

Session B5 - Random Matrices

July 13, 17:30 ~ 17:55 - Room B3

## Optimality of the Johnson-Lindenstrauss lemma

### Jelani Nelson, Harvard University, USA   -   minilek@seas.harvard.edu

One popular application of random matrices is the "random projections" tool offered by the Johnson-Lindenstrauss (JL) lemma. This lemma states that for any $n, d > 1$, any $0<\varepsilon<1/2$, and any $X\subset \ell_2^d$ of size $n$, there is a map $f:X\rightarrow \ell_2^m$ with $m = O(\varepsilon^{-2}\log n)$ and distortion $1+\varepsilon$. All known proofs of this lemma operate by defining $f(x) = \Pi x$ for some random matrix $\Pi$, where the distribution of $\Pi$ varies from proof to proof.
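The scheme $f(x) = \Pi x$ can be sketched as follows, here with a dense Gaussian $\Pi$ (one of several distributions known to work); the constant $8$ in the choice of $m$ is an illustrative assumption, not the sharp constant:

```python
# Minimal sketch of a JL random projection with a Gaussian matrix.
# Assumed for illustration: m = ceil(8 * log(n) / eps^2) rows (constant 8
# chosen for concreteness), and Gaussian data points as the input set X.
import numpy as np

rng = np.random.default_rng(0)

n, d, eps = 100, 1000, 0.5
m = int(np.ceil(8 * np.log(n) / eps**2))   # target dimension O(eps^-2 log n)

X = rng.standard_normal((n, d))            # n points in R^d

Pi = rng.standard_normal((m, d)) / np.sqrt(m)  # rows scaled so E||Pi x||^2 = ||x||^2
Y = X @ Pi.T                               # embedded points f(x) = Pi x in R^m

def pairwise_dists(A):
    """All pairwise Euclidean distances between rows of A."""
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff**2).sum(-1))

# Check that all pairwise distances are preserved up to a 1 +/- eps factor.
mask = ~np.eye(n, dtype=bool)
ratios = pairwise_dists(Y)[mask] / pairwise_dists(X)[mask]
print(m, ratios.min(), ratios.max())
```

With these parameters the empirical distortion is typically far inside the $1\pm\varepsilon$ window, since the guarantee holds with high probability over the draw of $\Pi$.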

We show that this "random projection" method offered by the JL lemma is optimal for Euclidean dimensionality reduction. That is, for every $n,d,\varepsilon$ as above where $\varepsilon$ is not too small, there exists a set $X\subset \ell_2^d$ with $|X| = n$ such that any embedding of $X$ into $\ell_2^m$ with distortion $1+\varepsilon$ (even a non-linear one) must have $m = \Omega(\varepsilon^{-2}\log n)$.

Joint work with Kasper Green Larsen (Aarhus University).

FoCM 2017