
Mathematical Foundations of Artificial Neural Networks - Winter semester 2018

News

  • The lectures will take place on Wednesdays, 12:15-13:45, in Garching-Forschungszentrum, Room XXX (first meeting: October 17), and on Thursdays, 10:15-11:45, in Garching-Forschungszentrum, Room XXX
  • WARNING: this course is intended as an ADVANCED MATHEMATICAL COURSE, addressed mainly to students of MATHEMATICS, as clearly stated in the course description below. Several students from Computer Science/Informatics, Engineering, BWL, etc. are currently registered. I think that this course is NOT suited for those who cannot fulfil the requirements/prerequisites: "MA1101 Linear Algebra and Discrete Structures 1, MA1102 Linear Algebra and Discrete Structures 2, MA1001 Analysis 1, MA1002 Analysis 2, MA3001 Functional Analysis, Approximation Theory, Fourier Analysis and Wavelets, MA4800 Foundations of Data Analysis". PLEASE, I really ask those who do not fulfil these prerequisites to unregister, so that the course can remain focused on its higher mathematical content. I hope very much for your understanding, and I am sure that by now TUM offers plenty of courses in more applied fields, such as Informatics, on the topic of machine learning that may be more appealing and better suited to students with an applied background.

Organization

Contents

This is an advanced course for Master's and doctoral students only, which collects and presents a selection of relevant mathematical results on the analysis of artificial neural networks. The course will not be complemented by exercises; instead, in-depth theory will be presented. It will draw on a broad range of mathematical tools, including function space theory, the theory of orthogonal polynomials, Fourier and wavelet analysis, and compressed sensing. We will focus on three fundamental issues:

  • Approximation: how well artificial neural networks can in general approximate complex functions; we consider shallow and deep neural networks and their different approximation properties with respect to classes of smooth and non-smooth functions.
  • Stability: the stability properties of artificial neural networks with respect to perturbations of the inputs; it is by now well known that the classification produced by a neural network can be fooled by simple perturbations (such as perturbing a single pixel in an image). We will explore the origin of this phenomenon and additionally show stability results for certain network architectures.
  • Learnability: the learnability of artificial neural networks, in particular how large the training set needs to be in order to identify a network.

For all three aspects there are different approaches in the literature, using different mathematical methods, and we will try to give a systematic view of them.
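As an entry point to the approximation issue, one classical result (not necessarily in the formulation that will be used in the lectures) is the universal approximation theorem of Cybenko (1989): shallow networks with a continuous sigmoidal activation are dense in the space of continuous functions on the cube. In LaTeX notation:

  \text{For continuous sigmoidal } \sigma, \text{ the shallow networks } G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma(w_j^\top x + \theta_j), \quad \alpha_j, \theta_j \in \mathbb{R}, \ w_j \in \mathbb{R}^n,

  \text{are dense in } C([0,1]^n): \ \forall f \in C([0,1]^n), \ \forall \varepsilon > 0 \ \exists N, \alpha_j, w_j, \theta_j \ \text{ with } \ \sup_{x \in [0,1]^n} |f(x) - G(x)| < \varepsilon.

For the stability issue, the mechanism behind single-pixel perturbations is easiest to see for a linear classifier x ↦ sign(⟨w, x⟩ + b): changing one input coordinate i by δ shifts the score by w_i δ, so a single coordinate with a large weight can flip the decision. The following minimal sketch (Python with NumPy; the data and all names are invented for illustration, and deep nonlinear networks require more elaborate attacks) demonstrates this:

  import numpy as np

  # Hypothetical linear classifier on flattened 4x4 "images" (16 pixels);
  # the weights and the sample input are random, for illustration only.
  rng = np.random.default_rng(0)
  w = rng.normal(size=16)   # weight vector
  b = 0.1                   # bias

  def classify(x):
      """Predicted label sign(<w, x> + b), in {-1, +1}."""
      return 1 if w @ x + b > 0 else -1

  x = rng.normal(size=16)   # a sample input
  label = classify(x)

  # Perturb the single coordinate with the largest |w_i|, pushing the
  # score past zero: the score changes exactly by w[i] * delta.
  i = int(np.argmax(np.abs(w)))
  score = w @ x + b
  delta = -(score + label) / w[i]   # moves the score to -label
  x_adv = x.copy()
  x_adv[i] += delta

  print(label, classify(x_adv))     # the two labels differ

The same score-shift computation indicates why stability results must control the sensitivity of a network's output to each individual input coordinate.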

-- MassimoFornasier - 13 Oct 2018