
# Christian Kümmerle

Position: Doctoral student
E-mail: kuemmerle@jhu.edu
Address: Department of Mathematics, Boltzmannstraße 3, 85748 Garching (Munich), Germany

My research is related to problems in machine learning, statistics and signal processing. In particular, I study optimization methods to solve problems in these areas in a scalable, robust and data-efficient manner.

In this context I have worked on low-rank matrix models, iteratively reweighted algorithms and high-dimensional probability theory, and their applications to matrix completion, robust principal component analysis, compressed sensing and super resolution.

You can find some code related to my research on GitHub.

## Preprints

O. Guédon, F. Krahmer, C. Kümmerle, S. Mendelson and H. Rauhut. On the geometry of polytopes generated by heavy-tailed random vectors, arXiv:1907.07258, 2019.

```bibtex
@unpublished{GKKMR19,
  author = {Olivier Guédon and Felix Krahmer and Christian Kümmerle and Shahar Mendelson and Holger Rauhut},
  title  = {On the geometry of polytopes generated by heavy-tailed random vectors},
  year   = {2019},
  url    = {https://arxiv.org/abs/1907.07258}
}
```

F. Krahmer, C. Kümmerle and H. Rauhut. A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing, arXiv:1806.04261, 2018.

Abstract: For a large class of random matrices $A$ with i.i.d. entries we show that the $\ell_1$-quotient property holds with probability exponentially close to 1. In contrast to previous results, our analysis does not require concentration of the entrywise distributions. We provide a unified proof that recovers corresponding previous results for (sub-)Gaussian and Weibull distributions. Our findings generalize known results on the geometry of random polytopes, providing lower bounds on the size of the largest Euclidean ball contained in the centrally symmetric polytope spanned by the columns of $A$. At the same time, our results establish robustness of noise-blind $\ell_1$-decoders for recovering sparse vectors $x$ from underdetermined, noisy linear measurements $y = Ax + w$ under the weakest possible assumptions on the entrywise distributions that allow for recovery with optimal sample complexity even in the noiseless case. Our analysis predicts superior robustness behavior for measurement matrices with super-Gaussian entries, which we confirm by numerical experiments.

```bibtex
@unpublished{KrahmerKuemmerleRauhut18,
  author = {Felix Krahmer and Christian Kümmerle and Holger Rauhut},
  title  = {A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing},
  year   = {2018},
  url    = {https://arxiv.org/abs/1806.04261}
}
```
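The noise-blind $\ell_1$-decoder studied in this preprint is the equality-constrained program $\min \|x\|_1$ subject to $Ax = y$, applied as-is even when $y$ is noisy. As a minimal illustration (not the paper's code; the function name, sizes, and solver choice are my own), it can be posed as a linear program in the variables $(x, t)$ with $-t \le x \le t$:

```python
import numpy as np
from scipy.optimize import linprog

def l1_decode(A, y):
    """Noise-blind l1 decoder: argmin ||x||_1 subject to Ax = y,
    reformulated as an LP: minimize sum(t) with -t <= x <= t, Ax = y."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])            # x - t <= 0 and -x - t <= 0
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = y (t plays no role here)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n, method="highs")
    return res.x[:n]

# sanity check: exact recovery of a sparse vector from Gaussian measurements
rng = np.random.default_rng(0)
m, n, k = 40, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = l1_decode(A, A @ x_true)
```

Feeding a noisy $y = Ax + w$ into the same decoder, without any estimate of the noise level, is exactly the noise-blind setting whose robustness the quotient property guarantees.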

## Journal Articles

C. Kümmerle and J. Sigl. Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery, Journal of Machine Learning Research, 19(47):1–49, 2018.

Abstract: We propose a new iteratively reweighted least squares (IRLS) algorithm for the recovery of a matrix $X \in \mathbb{C}^{d_1 \times d_2}$ of rank $r \ll \min(d_1, d_2)$ from incomplete linear observations, solving a sequence of low complexity linear problems. The easily implementable algorithm, which we call harmonic mean iteratively reweighted least squares (HM-IRLS), optimizes a non-convex Schatten-$p$ quasi-norm penalization to promote low-rankness and carries three major strengths, in particular for the matrix completion setting. First, the algorithm converges globally to the low-rank matrix for relevant, interesting cases, for which any other state-of-the-art optimization approach, convex or non-convex, fails to recover the matrix. Secondly, HM-IRLS exhibits an empirical recovery probability close to 100% even for a number of measurements very close to the theoretical lower bound $r(d_1 + d_2 - r)$, i.e., already for significantly fewer linear observations than any other tractable approach in the literature. Thirdly, HM-IRLS exhibits a locally superlinear rate of convergence (of order $2-p$) if the linear observations fulfill a suitable null space property. While for the first two properties we have so far only strong empirical evidence, we prove the third property as our main theoretical result.

```bibtex
@article{JMLR:v19:17-244,
  author  = {Christian K{\"u}mmerle and Juliane Sigl},
  title   = {Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery},
  journal = {Journal of Machine Learning Research},
  year    = {2018},
  volume  = {19},
  number  = {47},
  pages   = {1-49},
  url     = {http://jmlr.org/papers/v19/17-244.html}
}
```

## Conference Papers

C. Kümmerle and C. Mayrink Verdun. Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares, In 13th International Conference on Sampling Theory and Applications (SampTA), 2019.

```bibtex
@inproceedings{KV19_SampTA19,
  author = {C. Kümmerle and C. Mayrink Verdun},
  title  = {Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares},
  booktitle = {13th International Conference on Sampling Theory and Applications (SampTA)},
  year   = {2019}
}
```

C. Kümmerle and C. Mayrink Verdun. Denoising and Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares, In International Traveling Workshop on Interactions between Low-Complexity Data Models and Sensing Techniques (iTWIST), Marseille, France, 2018.

Abstract: We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of completing or denoising low-rank matrices that are structured, e.g., that possess a Hankel, Toeplitz or block-Hankel/Toeplitz structure. The algorithm optimizes an objective based on a non-convex surrogate of the rank by solving a sequence of quadratic problems. Our strategy combines computational efficiency, as it operates on a lower dimensional generator space of the structured matrices, with high statistical accuracy which can be observed in experiments on hard estimation and completion tasks. Our experiments show that the proposed algorithm StrucHMIRLS exhibits an empirical recovery probability close to 1 from fewer samples than the state-of-the-art in a Hankel matrix completion task arising from the problem of spectral super-resolution of badly separated frequencies. Furthermore, we explain how the proposed algorithm for structured low-rank recovery can be used as a preprocessing step for improved robustness in frequency or line spectrum estimation problems.

```bibtex
@inproceedings{KV18_iTwist18,
  author = {C. Kümmerle and C. Mayrink Verdun},
  title  = {Denoising and Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares},
  booktitle = {International Traveling Workshop on Interactions between Low-Complexity Data Models and Sensing Techniques (iTWIST), Marseille, France},
  year   = {2018}
}
```

C. Kümmerle and J. Sigl. Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery, In 12th International Conference on Sampling Theory and Applications (SampTA), pp. 489-493, 2017.

```bibtex
@inproceedings{HMIRLS_SampTA17,
  author = {C. Kümmerle and J. Sigl},
  title  = {Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery},
  booktitle = {12th International Conference on Sampling Theory and Applications (SampTA)},
  year   = {2017},
  pages  = {489-493}
}
```
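The link between spectral super-resolution and structured low-rank recovery exploited in these papers rests on a classical fact: a signal that is a sum of $r$ complex exponentials generates a Hankel matrix of rank $r$. A minimal numerical check of this fact (the helper name, sizes, and frequencies below are my own illustrative choices):

```python
import numpy as np

def hankel_matrix(x, L):
    """L x (len(x) - L + 1) Hankel matrix with constant anti-diagonals:
    H[i, j] = x[i + j]."""
    n = len(x)
    return np.array([x[j:j + L] for j in range(n - L + 1)]).T

# a sum of r complex exponentials yields a rank-r Hankel matrix
n, r = 64, 3
t = np.arange(n)
freqs = [0.10, 0.13, 0.40]                 # normalized frequencies
x = sum(np.exp(2j * np.pi * f * t) for f in freqs)
H = hankel_matrix(x, 20)
print(np.linalg.matrix_rank(H))            # rank equals r = 3
```

Completing or denoising such a Hankel matrix under a low-rank prior therefore amounts to interpolating or denoising the underlying signal, which is why structured low-rank algorithms serve as a preprocessing step for frequency estimation.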

## News

2020
From January Postdoctoral Fellow at Johns Hopkins University, Department of Applied Mathematics & Statistics
2019
December 19 Doctoral defense of my dissertation "Understanding and Enhancing Data Recovery Algorithms: From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization"
October–December Visiting Prof. Mauro Maggioni at Johns Hopkins University, Baltimore, USA
September 12–13 Workshop on Low-Rank Models and Applications (LRMA) in Mons, Belgium
September 9–11 Visiting Prof. Ivan Markovsky at Vrije Universiteit Brussel, Belgium
July 8–12 Invited talk on Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares at the 13th International Conference on Sampling Theory and Applications (SampTA 2019) in Bordeaux, France
2018
December 17–19 Vienna Workshop of Computational Optimization 2018 (VWCO18)
November 21–23 iTWIST: International Traveling Workshop on Interactions between Low-Complexity Data Models and Sensing Techniques, Marseille, France
November 13 Oberseminar talk at the Applied Analysis Seminar, Universität Osnabrück
October 14–19 Oberwolfach Seminar: Mathematics of Deep Learning
September 12–13 Retreat of the groups of Prof. Holger Rauhut  and Prof. Rudolf Mathar  in Spa
August 29–31 EURASIP Summer School Tensor-Based Signal Processing
August 1–9 International Congress of Mathematicians 2018 (ICM 2018)
July 26–31 Talk at the conference on Asymptotic and Affine Geometric Analysis in Rio de Janeiro, Brazil
June 25–29 Visit at the group of Prof. Holger Rauhut at RWTH Aachen
June 12–13 Visit at the group of Prof. Daniel Potts at TU Chemnitz
June 11 New preprint, A Quotient Property for Matrices with Heavy-Tailed Entries and its Application to Noise-Blind Compressed Sensing, joint work with Felix Krahmer and Holger Rauhut
March 25–31 Workshop on Applied Harmonic Analysis and Data Processing in Oberwolfach
March 19–23 Talk at the 89th GAMM Annual Meeting about "Super-Resolution meets Seismology: Seismic Data Interpolation via Structured Low-Rank Matrix Recovery" in Munich
February 26 – March 2 Poster presentation at the Winter School Modern Methods in Nonsmooth Optimization about "Iteratively Reweighted Least Squares Algorithms for Low-Complexity Matrix Recovery Problems" in Würzburg
2017
December 4–8 presenting a poster about The Quotient Property and the Robustness of $\ell_1$-minimization under Weak Moment Assumptions at the 3rd International Matheon Conference on Compressed Sensing and its Applications at TU Berlin
July 13–19 presenting a poster at Foundations of Computational Mathematics 2017 in Barcelona, Spain
July 3–7 giving a talk on Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery [slides] at the 12th International Conference on Sampling Theory and Applications (SampTA 2017) in Tallinn, Estonia
June 5–8 presenting a poster at SPARS 2017 in Lisbon, Portugal
March 16 submitted the paper "Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery" together with my colleague Juliane Sigl; supplementary code is available online
2016
July 18–22 participating in the Summer School on Mathematical Methods for High-Dimensional Data Analysis at TU München
June 27 – July 1 participating in the Summer School on Regularization Methods for Machine Learning 2016 at the University of Genoa, Italy
April 18–22 participating in the Hausdorff School on Low-rank Tensor Techniques in Numerical Analysis and Optimization at Universität Bonn
March 14–18 participating in the Workshop on Harmonic Analysis, Graphs and Learning, Hausdorff Research Institute for Mathematics, Bonn
February 15–19 participating in the Workshop on Low Complexity Models in Signal Processing, Hausdorff Research Institute for Mathematics, Bonn
February 12 co-organizing the Second Workshop Donau-Isar-Inn (WDI²) on Approximation Theory and Applications at TU München
January 11–15 participating and presenting a poster at the Winter School on Advances in Mathematics of Signal Processing, Hausdorff Research Institute for Mathematics, Bonn
January 4 – April 22 guest researcher in the Special Trimester Program "Mathematics of Signal Processing", Hausdorff Research Institute for Mathematics, Bonn
2015
December 7–11 participating in the 2nd International Matheon Conference on Compressed Sensing and its Applications at TU Berlin
December 3–5 attending the Winter School on Compressed Sensing 2015 at TU Berlin
October 26–30 participating in the conference "Convexity, probability and discrete structures, a geometric view point" in Marne-la-Vallée, France
October 1 started my work as a doctoral student at the Chair for Applied Numerical Analysis