Sjoerd Dirksen | Dimensionality reduction with Johnson-Lindenstrauss embeddings
- https://wsc.project.cwi.nl/ml-reading-group/events/sjoerd-dirksen-tbd
- 2017-04-10T14:30:00+02:00
- 2017-04-10T15:30:00+02:00
- Machine Learning Seminar by Sjoerd Dirksen from RWTH Aachen. Dirksen will talk about dimensionality reduction using random projections.
- When Apr 10, 2017 from 02:30 PM to 03:30 PM (Europe/Amsterdam / UTC+0200)
- Where L016
In modern data analysis, one frequently needs to deal with high-dimensional data sets. To alleviate the computational and storage issues that arise in handling this type of data, it is of interest to pre-process the data set to reduce its dimensionality, while preserving the information in the data that is vital for the computational task at hand.
In this talk I will consider dimensionality reduction with Johnson-Lindenstrauss embeddings. These random matrix constructions are designed to reduce the dimension while approximately preserving Euclidean inter-point distances in the data set. In particular, I will focus on the 'fast' and 'sparse' Johnson-Lindenstrauss embeddings, which allow for a fast implementation. The presented results, which rely on chaining methods, quantify how the achievable embedding dimension depends on the 'intrinsic dimension' of the original data set. If time permits, I will discuss applications to manifold embeddings and constrained least squares programs.
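As a small illustration of the distance-preservation property described above, the following sketch projects a random high-dimensional point set with a dense Gaussian matrix, the classical Johnson-Lindenstrauss construction, and compares inter-point distances before and after. The dimensions and data are hypothetical, chosen only for demonstration; the talk's fast and sparse constructions replace the dense Gaussian matrix with structured or sparse ones.

```python
# Illustrative sketch (not from the talk): a random Gaussian projection
# A in R^{m x n}, m << n, approximately preserves pairwise Euclidean
# distances with high probability (Johnson-Lindenstrauss lemma).
import numpy as np

rng = np.random.default_rng(0)
n, m, num_points = 10_000, 400, 50        # ambient dim, embedding dim, #points
X = rng.standard_normal((num_points, n))  # hypothetical high-dimensional data

# Dense Gaussian JL matrix, scaled so that E||Ax||^2 = ||x||^2.
A = rng.standard_normal((m, n)) / np.sqrt(m)
Y = X @ A.T                               # embedded points in R^m

def pdist(Z):
    # Pairwise Euclidean distances via the Gram matrix.
    sq = (Z ** 2).sum(axis=1)
    D2 = np.maximum(sq[:, None] + sq[None, :] - 2 * (Z @ Z.T), 0.0)
    return np.sqrt(D2)

D_orig, D_emb = pdist(X), pdist(Y)
mask = ~np.eye(num_points, dtype=bool)
ratios = D_emb[mask] / D_orig[mask]
print(f"distance ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")
```

With these parameters the embedded/original distance ratios concentrate near 1, even though the dimension drops from 10,000 to 400; the achievable embedding dimension scales with the (log of the) number of points rather than the ambient dimension.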
This talk is partly based on joint work with Jean Bourgain (IAS Princeton) and Jelani Nelson (Harvard).