MLPR lectures, Autumn 2018
PDF scans: The PDF links below are to scans of what was handwritten
during each lecture. If anything is unclear, please refer to the actual class notes. I give direct links to the relevant
parts. Also, please still take your own notes in class!
- Lecture 1, Monday week 1:
Logistics and motivation.
- Lecture 2, Wednesday week 1:
Linear functions and least squares.
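As a quick companion to this topic, a minimal least-squares fit in NumPy; the data and variable names here are made up for illustration, not taken from the notes:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=100)

# Augment with a bias column and solve the least-squares problem.
X_aug = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
# w is close to [2.0, 1.0], up to the noise.
```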
- Lecture 3, Thursday week 1:
Introduction to basis functions and L2 regularization.
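A small sketch of the idea, assuming a polynomial basis and the closed-form solution of the L2-regularized (ridge) problem; function and variable names are illustrative:

```python
import numpy as np

def fit_ridge(Phi, y, lam):
    # Minimize ||Phi w - y||^2 + lam ||w||^2; closed-form solution.
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

x = np.linspace(-1, 1, 50)
Phi = np.stack([x**k for k in range(4)], axis=1)  # basis: 1, x, x^2, x^3
y = np.sin(3 * x)
w = fit_ridge(Phi, y, lam=0.1)
```

Larger `lam` shrinks the weights towards zero, trading training fit for smoother predictions.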
- Lecture 4, Monday week 2:
Generalization and dataset splits.
- Lecture 5, Wednesday week 2:
More generalization, Gaussians, CLT.
- Lecture 6, Thursday week 2:
Standard error bars, different sources of variability. Multivariate Gaussians.
- Lecture 7, Monday week 3:
- Lecture 8, Wednesday week 3:
More on baseline classifiers, regressing on labels.
- Lecture 9, Thursday week 3:
Gradients, linear regression to logistic regression.
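A minimal sketch of the logistic-regression gradient on made-up data; the labels below are a hypothetical thresholding of one feature, just to have something to fit:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def nll_grad(w, X, y):
    # Gradient of the negative log-likelihood for labels y in {0, 1}.
    return X.T @ (sigmoid(X @ w) - y)

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
y = (X[:, 0] > 0).astype(float)

w = np.zeros(3)
w -= 0.1 * nll_grad(w, X, y)  # one gradient-descent step
```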
- Lecture 10, Monday week 4:
Stochastic gradients, softmax regression.
- Lecture 11, Wednesday week 4:
Robust logistic regression (cost functions from probabilistic models).
- Lecture 12, Thursday week 4:
Feedforward neural nets.
- Lecture 13, Monday week 5:
Fitting neural nets, initial outline of back-propagation.
- Lecture 14, Wednesday week 5:
Reverse-mode differentiation (back-propagation) with matrices.
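A small NumPy sketch of reverse-mode differentiation in matrix form, for an assumed toy cost f = 0.5 ||tanh(W x) - t||^2; the shapes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(4, 3))
x = rng.normal(size=3)
t = rng.normal(size=4)

# Forward pass.
a = W @ x
h = np.tanh(a)
f = 0.5 * np.sum((h - t) ** 2)

# Backward pass: propagate "bar" (adjoint) quantities in reverse order.
h_bar = h - t                 # df/dh
a_bar = h_bar * (1 - h ** 2)  # through tanh: d tanh(a)/da = 1 - tanh(a)^2
W_bar = np.outer(a_bar, x)    # df/dW[i, j] = a_bar[i] * x[j]

# Spot-check one element against a finite difference.
eps = 1e-6
W_pert = W.copy()
W_pert[0, 0] += eps
f_pert = 0.5 * np.sum((np.tanh(W_pert @ x) - t) ** 2)
err = abs(W_bar[0, 0] - (f_pert - f) / eps)
```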
- Lecture 15, Thursday week 5:
Autoencoders and PCA/linear-autoencoder demos (PCA is covered in the next lecture).
- Lecture 16, Monday week 6:
PCA continued, SVD, Netflix prize.
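A minimal sketch of PCA computed via the SVD of the centred data matrix, on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))

Xc = X - X.mean(axis=0)                 # centre the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
V = Vt[:2].T                            # top-2 principal directions
Z = Xc @ V                              # 2-D projection (the codes)
X_rec = Z @ V.T + X.mean(axis=0)        # rank-2 reconstruction
```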
- Lecture 17, Wednesday week 6:
Netflix and privacy. Start of probabilistic and Bayesian regression.
- Lecture 18, Thursday week 6:
Bayesian inference and prediction.
- Lecture 19, Monday week 7:
Bayesian inference and prediction continued.
- Lecture 20, Wednesday week 7:
Bayesian linear regression review, and Bayesian model choice.
Python demo to match the end of the lecture.
- Lecture 21, Thursday week 7:
Gaussian process priors.
A minimal GP demo: matlab/octave, python
Alternative GP demo: matlab/octave, python
- Lecture 22, Monday week 8:
Gaussian processes for regression and relationship to linear regression.
w7b, code as above.
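A minimal sketch of the GP regression posterior mean, assuming an RBF kernel and a made-up noise variance; this is not one of the linked demos:

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential (RBF) kernel between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=(10, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=10)
Xs = np.linspace(-3, 3, 50)[:, None]    # test inputs

K = rbf(X, X) + 0.01 * np.eye(10)       # kernel + noise variance 0.01
mean = rbf(Xs, X) @ np.linalg.solve(K, y)  # posterior mean at Xs
```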
- Lecture 23, Wednesday week 8:
Finish GPs (with a kernel logistic regression aside).
GP readings as in Lecture 22.
- Lecture 24, Thursday week 8:
Bayesian logistic regression and the Laplace approximation
(some parts will be covered next week).
pdf (there's a spurious 1/S in the final line of p5)
- Lecture 25, Monday week 9:
More on Bayesian logistic regression and the Laplace approximation.
- Lecture 26, Wednesday week 9:
KL-divergence and variational methods.
- Lecture 27, Thursday week 9:
Some motivation for Gaussian mixture models, and stochastic variational inference.
A minimal stochastic variational inference demo: Matlab/Octave (single-file, or a more complete tar-ball); Python version.
Apologies: I was forced to present the material out of order because the
document camera wasn't working (again) at the start of the lecture. I have
made a plea for the setup to be replaced.
- Lecture 28, Monday week 10:
Fitting Gaussian mixtures with gradients, or with EM, a bound-based optimizer.
- Lecture 29, Wednesday week 10:
More on optimization, and ensembles.
- Lecture 30, Thursday week 10:
Exam preparation advice (alternative EASE link).
That’s all folks! Have a great winter break.
If you are enrolled in the class, please take the class survey.
|Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426
Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh.