
# Christian Kümmerle

Position: Doctoral student
E-mail: christian.kuemmerle@ma.tum.de
Telephone: +49 (0) 89 289 17467
Room: 02.10.033
Address: Department of Mathematics, Boltzmannstraße 3, 85748 Garching (Munich), Germany

### Research Interests

High-Dimensional Probability Theory, Compressive Sensing, Matrix Recovery, Mathematical Data Analysis, Machine Learning, Non-Convex Optimization, Signal Processing, Phase Retrieval, Super Resolution


## Preprints

• F. Krahmer, C. Kümmerle and H. Rauhut. A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing, arXiv:1806.04261, 2018.

  Abstract: For a large class of random matrices $A$ with i.i.d. entries we show that the $\ell_1$-quotient property holds with probability exponentially close to 1. In contrast to previous results, our analysis does not require concentration of the entrywise distributions. We provide a unified proof that recovers corresponding previous results for (sub-)Gaussian and Weibull distributions. Our findings generalize known results on the geometry of random polytopes, providing lower bounds on the size of the largest Euclidean ball contained in the centrally symmetric polytope spanned by the columns of $A$. At the same time, our results establish robustness of noise-blind $\ell_1$-decoders for recovering sparse vectors $x$ from underdetermined, noisy linear measurements $y = Ax + w$ under the weakest possible assumptions on the entrywise distributions that allow for recovery with optimal sample complexity even in the noiseless case. Our analysis predicts superior robustness behavior for measurement matrices with super-Gaussian entries, which we confirm by numerical experiments.

  BibTeX:

      @unpublished{KrahmerKuemmerleRauhut18,
        author = {Felix Krahmer and Christian Kümmerle and Holger Rauhut},
        title = {A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing},
        year = {2018},
        url = {https://arxiv.org/abs/1806.04261}
      }

• C. Kümmerle and J. Sigl. Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery, arXiv:1703.05038, 2017. [supplementary code]

  Abstract: We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1 \times d_2}$ of rank $r \ll \min(d_1, d_2)$ from incomplete linear observations, solving a sequence of low-complexity linear problems. The easily implementable algorithm, which we call harmonic mean iteratively reweighted least squares (HM-IRLS), optimizes a non-convex Schatten-$p$ quasi-norm penalization to promote low-rankness and carries three major strengths, in particular for the matrix completion setting. First, the algorithm converges globally to the low-rank matrix in relevant, interesting cases for which any other (non-)convex state-of-the-art optimization approach fails the recovery. Secondly, HM-IRLS exhibits an empirical recovery probability close to 100% even for a number of measurements very close to the theoretical lower bound $r(d_1 + d_2 - r)$, i.e., already for significantly fewer linear observations than any other tractable approach in the literature. Thirdly, HM-IRLS exhibits a locally superlinear rate of convergence (of order $2 - p$) if the linear observations fulfill a suitable null space property. While for the first two properties we have, so far, only strong empirical evidence, we prove the third property as our main theoretical result.

  BibTeX:

      @unpublished{KS17,
        author = {Kümmerle, C. and Sigl, J.},
        title = {Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery},
        year = {2017},
        url = {https://arxiv.org/abs/1703.05038}
      }

### Talks

• Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery. SampTA 2017, Tallinn, July 2017 [slides]

## News

2018
August 29–31 EURASIP Summer School Tensor-Based Signal Processing
August 1–9 International Congress of Mathematicians 2018 (ICM 2018)
July 26–31 Talk at the conference on Asymptotic and Affine Geometric Analysis in Rio de Janeiro, Brazil
June 25–29 Visit at the group of Prof. Holger Rauhut  at RWTH Aachen
June 12–13 Visit at the group of Prof. Daniel Potts  at TU Chemnitz
June 11 New preprint, A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing, joint with Felix Krahmer and Holger Rauhut.
March 25–31 Workshop on Applied Harmonic Analysis and Data Processing  in Oberwolfach
March 19–23 Talk at the 89th GAMM Annual Meeting  about "Super-Resolution meets Seismology: Seismic Data Interpolation via Structured Low-Rank Matrix Recovery" in Munich
February 26 – March 2 Poster presentation at the Winter School Modern Methods in Nonsmooth Optimization  about "Iteratively Reweighted Least Squares Algorithms for Low-Complexity Matrix Recovery Problems" in Würzburg
2017
December 4–8 presenting a poster about The Quotient Property and the Robustness of $\ell_1$-minimization under Weak Moment Assumptions at the 3rd International Matheon Conference on Compressed Sensing and its Applications at TU Berlin, Berlin
September 4 – October 22 visiting Prof. Jianwei Ma in Harbin, China
July 13–19 presenting a poster at Foundations of Computational Mathematics 2017  in Barcelona, Spain
July 3–7 giving a talk on Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery [slides] at the 12th International Conference on Sampling Theory and Applications (SampTA 2017)  in Tallinn, Estonia
June 5–8 presenting a poster at SPARS 2017  in Lisbon, Portugal
March 16 submitted the paper "Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery" [.pdf] together with my colleague Juliane Sigl. Supplementary code can be found here.
2016
July 18–22 participating in the Summer School on Mathematical Methods for High-Dimensional Data Analysis at TU München
June 27–July 1 participating in the Summer School on Regularization Methods for Machine Learning 2016  at University of Genoa, Italy
April 18–22 participating in the Hausdorff School on Low-rank Tensor Techniques in Numerical Analysis and Optimization  at Universität Bonn
March 14–18 participating in the Workshop on Harmonic Analysis, Graphs and Learning , Hausdorff Research Institute for Mathematics, Bonn
February 15–19 participating in the Workshop on Low Complexity Models in Signal Processing , Hausdorff Research Institute for Mathematics, Bonn
February 12 co-organizing the Second Workshop Donau-Isar-Inn: WDI² - Approximation Theory and Applications at TU München
January 11–15 participating and presenting a poster at the Winter School on Advances in Mathematics of Signal Processing , Hausdorff Research Institute for Mathematics, Bonn
January 4 – April 22 Guest researcher in the Special Trimester Program ”Mathematics of Signal Processing” , Hausdorff Research Institute for Mathematics, Bonn
2015
December 7–11 participating in the 2nd International Matheon Conference on Compressed Sensing and its Applications at TU Berlin, Berlin
December 3–5 attending the Winter School on Compressed Sensing 2015  at TU Berlin, Berlin
October 26–30 participating in the conference "Convexity, probability and discrete structures, a geometric view point"  in Marne-la-Vallée, France
October 1 started work as a doctoral student at the chair for Applied Numerical Analysis