MLPR 2019 | Notes | Lectures | Forum | Tutorials | Assignments | Feedback

These notes were written from scratch for this class.
We will respond to your comments and questions, and fix
or expand parts when necessary. However, effort is also
required from you: *please* sign up to the forum and ask
questions.

You can step through the HTML version of these notes using the left and right arrow keys.

Each note links to a PDF version for better printing. However, if possible,
**please annotate the HTML versions of the notes** in the forum, to keep the
class's comments together. If the HTML notes don't render well for you,
try viewing them in Chrome/Chromium. If you want quick access to the PDFs from
this page, you can toggle the
PDF links.

A rough indication of the schedule is given, although we won’t follow it exactly.

Background information

- w0a – Course administration, html, pdf.
- w0b – Books useful for MLPR, html, pdf.
- w0c – MLPR background self-test, html, pdf. Answers: html, pdf.
- w0d – Maths background for MLPR, html, pdf.
- w0e – Programming in Matlab/Octave or Python, html, pdf.
- w0f – Expectations and sums of variables, html, pdf.
- w0g – Notation, html, pdf.

Week 1:

- w1a – Course Introduction, html, pdf.
- w1b – Linear regression, html, pdf.
- w1c – Linear regression, overfitting, and regularization, html, pdf.

Week 2:

- w2a – Training, Testing, and Evaluating Different Models, html, pdf.
- w2b – Univariate Gaussians, html, pdf. Answers: html, pdf.
- w2c – The Central Limit Theorem (CLT), html, pdf. Answers: html, pdf.
- w2d – Error bars, html, pdf.
- w2e – Multivariate Gaussians, html, pdf.

Week 3:

- w3a – Classification: Regression, Gaussians, and pre-processing, html, pdf.
- w3b – Bayesian regression, html, pdf.

Week 4:

- w4a – Bayesian inference and prediction, html, pdf.
- w4b – Bayesian model choice, html, pdf.
- A Bayesian linear regression demo: matlab/octave, python
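The linked demos aren't reproduced here. As a rough illustration of the idea only, here is a minimal sketch of Bayesian linear regression with a Gaussian prior and Gaussian noise; the function name, basis, and noise settings are my own choices, not those of the course demo.

```python
import numpy as np

def blr_posterior(Phi, y, sigma_w=1.0, sigma_y=0.1):
    """Posterior mean and covariance of weights for
    prior w ~ N(0, sigma_w^2 I) and likelihood y ~ N(Phi w, sigma_y^2 I)."""
    D = Phi.shape[1]
    # Posterior precision: (1/sigma_w^2) I + (1/sigma_y^2) Phi^T Phi
    A = np.eye(D) / sigma_w**2 + Phi.T @ Phi / sigma_y**2
    cov = np.linalg.inv(A)
    mean = cov @ Phi.T @ y / sigma_y**2
    return mean, cov

# Tiny example: fit a line to noisy data.
rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 20)
Phi = np.stack([np.ones_like(X), X], axis=1)   # basis functions: [1, x]
y = 0.5 + 2.0 * X + 0.1 * rng.standard_normal(20)
mean, cov = blr_posterior(Phi, y)
# With this much low-noise data, the posterior mean is close to the
# true weights [0.5, 2.0], and cov gives the remaining uncertainty.
```

With small noise and enough data the prior's shrinkage becomes negligible and the posterior mean approaches the least-squares fit.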

Week 5:

- w5a – Gaussian processes, html, pdf.
- A minimal GP demo: matlab/octave, python
- Alternative GP demo: matlab/octave, python
- w5b – Gaussian Processes and Kernels, html, pdf.
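For orientation alongside the linked demos, the following is a minimal sketch of GP regression with a squared-exponential (RBF) kernel, a zero mean function, and observation noise; the kernel hyperparameters and function names here are illustrative assumptions, not the demo's.

```python
import numpy as np

def rbf(X1, X2, ell=0.5, sf=1.0):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :])**2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(X, y, Xs, sigma_n=0.1):
    """GP posterior mean and variance at test inputs Xs."""
    K = rbf(X, X) + sigma_n**2 * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)                 # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xs, Xs)) - np.sum(v**2, axis=0)
    return mean, var

X = np.array([-1.0, 0.0, 1.0])
y = np.sin(X)
mean, var = gp_predict(X, y, np.array([0.0, 2.0]))
# At a training input the posterior mean is close to the observed value
# and the variance is small; far from the data, the variance reverts
# towards the prior variance.
```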

Week 6:

- w6a – Regression and Gradients, html, pdf.
- w6b – Logistic Regression, html, pdf.
- w6c – Softmax and robust regressions, html, pdf.

Week 7:

- w7a – Neural networks introduction, html, pdf.
- w7b – Fitting and initializing neural networks, html, pdf.
- w7c – Backpropagation of Derivatives, html, pdf.

Week 8:

- w8a – Autoencoders and Principal Components Analysis (PCA), html, pdf.
- w8b – Netflix Prize, html, pdf.
- w8c – Bayesian logistic regression and Laplace approximations, html, pdf. Answers: html, pdf.
- w8d – Computing logistic regression predictions, html, pdf.

Week 9:

- Tuesday (12 Nov): guest lecture by John Quinn. Notes and source code.
- w9a – Variational objectives and KL Divergence, html, pdf.
- w9b – More details on variational methods, html, pdf.
- A minimal stochastic variational inference demo: Matlab/Octave: single-file, more complete tar-ball; Python version.
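As a rough sketch of what such a demo does (not the linked code; the model, learning rate, and iteration count are my own toy choices): fit a Gaussian q(w) = N(m, s²) to the posterior of a 1-D Gaussian model by stochastic gradient ascent on the ELBO, using the reparameterization trick.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=50)          # toy data, true mean 2

def grad_log_joint(w):
    """Gradient of log p(w, y) for prior N(w; 0, 10^2), likelihood N(y_i; w, 1)."""
    return -w / 100.0 + np.sum(y - w)

# Variational family q(w) = N(m, s^2), with s = exp(log_s).
m, log_s = 0.0, 0.0
lr = 1e-3
for t in range(5000):
    eps = rng.standard_normal()
    s = np.exp(log_s)
    w = m + s * eps                        # reparameterization trick
    g = grad_log_joint(w)
    m += lr * g                            # single-sample estimate of dELBO/dm
    log_s += lr * (g * eps * s + 1.0)      # + d(entropy)/d(log_s) = 1

# Here the exact posterior is Gaussian, so q should approach it:
# mean near np.mean(y), std near 1/sqrt(50.01).
```

In this conjugate toy problem the exact posterior is available, which makes it easy to check that the stochastic updates converge to the right answer; the same machinery applies when the posterior is intractable.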

Week 10:

- Lectures start with the Monte Carlo section of w8d, then cover w9a and w9b.
- w10a – Gaussian mixture models, html, pdf.
- Exam preparation advice (alternative EASE link). These links won't work if you are viewing the page through the Hypothesis proxy.
- Please take the class survey.

Bonus material (non-examinable):

- w10b – Sparsity and L1 regularization, html, pdf.
- w10c – More on optimization, html, pdf.
- w10d – Ensembles and model combination, html, pdf.

Week 11: no lectures, but two Ed-Intelligence events:

- Wed 27 Nov 6–8pm, AT LT 2, Mini NeurIPS, please register.
- Fri 29 Nov 6–8pm, AT LT 5, To Err is Machine: Biases, Failure and Fairness in AI, please register.

A coarse overview of the major topics covered is given below. Some principles aren't taught as standalone topics because they're useful in multiple contexts: gradient-based optimization, different regularization methods, ethics, and practical choices such as feature engineering or numerical implementation.

- Linear regression and ML introduction
- Evaluating and choosing methods from the zoo of possibilities
- Multivariate Gaussians
- Classification, generative and discriminative models
- Bayesian machine learning: linear regression, Gaussian processes and kernels
- Neural Networks
- Learning low-dimensional representations
- Approximate Inference: Bayesian logistic regression, Laplace, Variational
- Gaussian mixture models
- Time allowing, other principles: sparsity/L1; ensembles (combination vs. averaging).

You are encouraged to write your own outlines and summaries of the course. Aim to make connections between topics, and imagine trying to explain to someone else what the main concepts of the course are.
