Date: Wed, December 1, 2010
Time: 11:15
Place: Research I Seminar Room
Abstract: The Johnson-Lindenstrauss (JL) Lemma states that any set of p points in high-dimensional Euclidean space can be embedded into O(δ^{-2} log(p)) dimensions without distorting the distance between any two points by more than a factor between 1-δ and 1+δ. We establish a new connection between the JL Lemma and the Restricted Isometry Property (RIP), a well-known concept in the theory of sparse recovery that is often used to show the success of ℓ1-minimization.
Consider an m by N matrix satisfying the (k, δ_k)-RIP and an arbitrary set E of O(e^k) points in R^N. We show that, with high probability, such a matrix with randomized column signs maps E into R^m without distorting the distance between any two points by more than a factor between 1-4δ_k and 1+4δ_k. Consequently, matrices satisfying the Restricted Isometry Property of optimal order provide optimal Johnson-Lindenstrauss embeddings up to a logarithmic factor in N. Moreover, our results yield the best known bounds on the necessary embedding dimension m for a wide class of structured random matrices. In particular, for partial Fourier and partial Hadamard matrices, our method optimizes the dependence of m on the distortion δ: we improve the recent bound m = O(δ^{-4} log(p) log^4(N)) of Ailon and Liberty (2010) to m = O(δ^{-2} log(p) log^4(N)), which is optimal up to the logarithmic factors in N. Our results also have a direct application in the area of compressed sensing for redundant dictionaries.
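The construction above can be illustrated numerically. The sketch below is not the paper's proof, only a small experiment under stated assumptions: a normalized Gaussian matrix stands in for an RIP matrix (partial Fourier or Hadamard matrices would be the structured examples from the abstract), its column signs are randomized by a random ±1 diagonal, and the resulting map is checked for pairwise distance distortion on a fixed point set. All sizes (N, m, num_points) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, m, num_points = 512, 128, 20

# A fixed set E of points in R^N (Gaussian points, for illustration only).
E = rng.standard_normal((num_points, N))

# A normalized Gaussian matrix: a standard stand-in that satisfies the RIP
# with high probability (the abstract's structured examples would be
# partial Fourier or partial Hadamard matrices instead).
Phi = rng.standard_normal((m, N)) / np.sqrt(m)

# Randomize column signs: multiply on the right by a random ±1 diagonal.
signs = rng.choice([-1.0, 1.0], size=N)
A = Phi * signs  # broadcasting flips the sign of each column j by signs[j]

# Measure the worst relative distortion of pairwise distances under x -> A x.
worst = 0.0
for i in range(num_points):
    for j in range(i + 1, num_points):
        d_orig = np.linalg.norm(E[i] - E[j])
        d_emb = np.linalg.norm(A @ (E[i] - E[j]))
        worst = max(worst, abs(d_emb / d_orig - 1.0))

print(f"worst relative distortion over all pairs: {worst:.3f}")
```

With these sizes the observed distortion is small, consistent with the δ-type guarantee: every pairwise distance is preserved up to a factor close to 1.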
This is joint work with Rachel Ward.