# Random Matrix Theory (MA5346) - SS 20

| Lecturer | Felix Krahmer | felix.krahmer@tum.de | MI 02.10.039 |
| Tutor | Stefan Bamberger | stefan.bamberger@tum.de | MI 02.10.033 |
| Lecture | Tue. 10:15 - 11:45 | Interims Hörsaal 1 (online due to COVID-19 pandemic) | |
| Tutorial | t.b.d. | | |

## News

Due to the COVID-19 pandemic, the class will not be held in person. A prerecorded video will be uploaded for each lecture. More details are given in the Moodle course.

## Content of the Lecture

A random matrix A is a matrix-valued random variable. For example, the entries may be i.i.d. scalar Gaussian or Bernoulli random variables, but random matrices with dependent entries will also be considered. There are many classical results about the asymptotic behaviour of the spectrum of such matrices, such as Wigner's semicircle law. Recently, however, it has become important to also understand the non-asymptotic spectral behaviour of random matrices. A typical question of interest is the following: for random matrices A of size m x N, how much do the singular values of their realizations differ from the predicted asymptotic behaviour?

The class will, in large parts, follow the lecture notes by Roman Vershynin [1]. We will start with non-asymptotic deviation estimates for random variables in one dimension. We introduce subgaussian and subexponential random variables and random vectors, as well as isotropic random vectors. These concepts then reappear in the study of random matrices, mainly matrices with independent rows or columns. We will prove results on the tail behaviour of their extreme singular values for different types of distributions of the row/column vectors. Furthermore, various applications will be discussed, including dimension reduction [2] and compressed sensing [3].

## Lecture Material
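The non-asymptotic question above can be illustrated numerically. The following sketch (using NumPy, with illustrative dimensions m = 2000 and N = 500, which are not taken from the lecture) samples an m x N matrix with i.i.d. standard Gaussian entries and compares its extreme singular values with the predicted edges sqrt(m) ± sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(0)
m, N = 2000, 500  # illustrative dimensions

# m x N matrix with i.i.d. standard Gaussian entries
A = rng.standard_normal((m, N))

# Singular values of this realization, in descending order
s = np.linalg.svd(A, compute_uv=False)

# For tall Gaussian matrices, the singular values concentrate with
# high probability in [sqrt(m) - sqrt(N), sqrt(m) + sqrt(N)];
# the deviations from these edges are of constant order.
lo = np.sqrt(m) - np.sqrt(N)
hi = np.sqrt(m) + np.sqrt(N)
print(f"largest singular value:  {s.max():.2f} (predicted edge {hi:.2f})")
print(f"smallest singular value: {s.min():.2f} (predicted edge {lo:.2f})")
```

Rerunning with different seeds shows that the extreme singular values of each realization stay within a constant-order distance of the predicted edges; quantifying such deviations is exactly the subject of the course.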

All additional material is given in the Moodle course.

## Literature

[1] Vershynin, R.: Introduction to the non-asymptotic analysis of random matrices. In: Compressed Sensing, Theory and Applications, ed. Y. Eldar and G. Kutyniok, Cambridge University Press, 2012, pp. 210-268.

[2] Krahmer, F. and Ward, R.: New and improved Johnson-Lindenstrauss embeddings via the Restricted Isometry Property. SIAM J. Math. Anal. 43(3), 2011, pp. 1269-1281.

[3] Foucart, S. and Rauhut, H.: A Mathematical Introduction to Compressive Sensing. Applied and Numerical Harmonic Analysis, Birkhäuser, 2013.

[4] Tropp, J.: User-friendly tail bounds for sums of random matrices. Found. Comput. Math. 12(4), 2012, pp. 389-434.

-- StefanBamberger - 20 Apr 2020