Mathematics of Machine Learning

Description:

The aim of the course is to review the mathematical tools needed for understanding machine learning algorithms by elaborating on ML-related topics such as embeddings, automatic differentiation, and Bayesian inference. Besides the theoretical investigation, we will implement simple Python programs for experimenting with the studied concepts. The course is rather interdisciplinary: the connections between the different areas of mathematics are strongly highlighted, the review of fundamentals from higher mathematics is application-oriented, and computer-aided experiments are used for all concepts.

Curriculum:

The overall goal of the course is to review the definitions and the most important theorems and relationships of the following topics:
I. linear algebra
II. calculus (univariate and multivariate), optimization
III. probability theory, statistics
and then to investigate the following topics using the tools above:
I. embeddings and dimensionality reduction: principal component analysis, spectral embedding (a minimal PCA sketch is given after this list)
II. automatic differentiation (special case: backpropagation), gradient-based methods
III. Fisher’s discriminant, Bayesian methodology
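To give a flavour of the computer-aided experiments, a minimal NumPy sketch of principal component analysis on synthetic data might look as follows (the dataset, library choice, and variable names are illustrative assumptions, not the course material itself):

  import numpy as np

  # Toy data: 200 points in R^3 concentrated near a 2-dimensional plane.
  rng = np.random.default_rng(0)
  latent = rng.normal(size=(200, 2))
  mixing = np.array([[1.0, 0.5, 0.0],
                     [0.0, 1.0, 0.5]])
  X = latent @ mixing + 0.05 * rng.normal(size=(200, 3))

  # PCA: center the data, then take leading eigenvectors of the sample covariance.
  Xc = X - X.mean(axis=0)
  cov = Xc.T @ Xc / (len(Xc) - 1)
  eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
  order = np.argsort(eigvals)[::-1]
  components = eigvecs[:, order[:2]]          # two leading principal directions
  Z = Xc @ components                         # 2-dimensional embedding of the data

  print("explained variance ratios:", np.round(eigvals[order] / eigvals.sum(), 3))

Here the principal directions come from the eigendecomposition of the sample covariance matrix; an equivalent route via the singular value decomposition of the centered data is also common.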

The following topics will also be considered:

  • basics of information theory: entropy, Kullback-Leibler divergence (a short sketch follows this list)
  • Lagrange multipliers and regularization
  • the EM (expectation-maximization) algorithm for Gaussian mixtures
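
Similarly, a minimal NumPy sketch of entropy and Kullback-Leibler divergence for discrete distributions could look like the following (the two distributions are made up for illustration):

  import numpy as np

  def entropy(p):
      # Shannon entropy H(p) = -sum_i p_i * log(p_i) in nats; terms with p_i = 0 contribute 0.
      p = np.asarray(p, dtype=float)
      logs = np.log(p, where=p > 0, out=np.zeros_like(p))
      return -np.sum(p * logs)

  def kl_divergence(p, q):
      # KL(p || q) = sum_i p_i * log(p_i / q_i); assumes q_i > 0 wherever p_i > 0.
      p = np.asarray(p, dtype=float)
      q = np.asarray(q, dtype=float)
      mask = p > 0
      return np.sum(p[mask] * np.log(p[mask] / q[mask]))

  # Two made-up distributions over three outcomes.
  p = np.array([0.5, 0.3, 0.2])
  q = np.array([0.4, 0.4, 0.2])
  print("H(p) =", entropy(p))
  print("KL(p||q) =", kl_divergence(p, q))
  print("KL(q||p) =", kl_divergence(q, p))   # KL divergence is not symmetric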

Evaluation:

1. Summary of an advanced algorithm in linear algebra, calculus, probability theory, or information theory (5 pts) – week 2.

2. Midterm (oral) exam (10 pts) – week 6.

3. Summary and presentation of an advanced algorithm in dimensionality reduction (15 pts) – week 13.