MLPR lecture log, Autumn 2017
Here is where I will put any scans or recordings from each lecture. If
anything is unclear, please refer to the actual class
notes. Also, please still take your own notes in class!
Videos take at least an hour to process. They are available in a fancy web
interface on “Media Hopper Replay” / “Echo 360”. To get there, go to Learn
once you are signed up for the class, and follow the "Media Hopper Replay"
link in the sidebar for the class.
I won't use Learn for anything else to do with the class.
PDF scans of what I wrote under the document camera in class (there were also
.mp4 file links here, but the videos are no longer available, sorry):
- Lecture 1, Monday week 1:
Logistics and motivation.
- Lecture 2, Wednesday week 1:
Linear functions and least squares.
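As an illustration only (not the course's actual demo), a least-squares linear fit of the kind covered in this lecture can be done in a few lines of NumPy; the toy data, slope, and intercept below are made up for the example:

```python
import numpy as np

# Minimal least-squares sketch (illustration only): fit w to minimize
# ||y - X w||^2 for a toy 1D problem, using a bias column in the design matrix.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)  # noisy line: slope 2, intercept 1
X = np.stack([x, np.ones_like(x)], axis=1)         # design matrix with bias feature
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # close to [2.0, 1.0]
```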
- Lecture 3, Thursday week 1:
Introduction to basis functions and L2 regularization.
- Lecture 4, Monday week 2:
Generalization and dataset splits.
- Lecture 5, Wednesday week 2:
More generalization, Gaussians, CLT, standard error bars.
Notes: w2b, w2c, w2d.
- Lecture 6, Thursday week 2:
Different sources of variability. Multivariate Gaussians.
- Lecture 7, Monday week 3:
Gaussian classifiers. Regressing on labels.
- Lecture 8, Wednesday week 3:
More on baseline classifiers, some calculus.
- Lecture 9, Thursday week 3:
Gradients, linear regression to logistic regression.
- Lecture 10, Monday week 4:
Stochastic gradients, softmax regression
- Lecture 11, Wednesday week 4:
Robust logistic regression (cost functions from probabilistic models)
- Lecture 12, Thursday week 4:
Feedforward neural nets
- Lecture 13, Monday week 5:
Fitting neural nets, start of back-propagation
- Lecture 14, Wednesday week 5:
John Quinn guest lecture: Jupyter notebook.
- Lecture 15, Thursday week 5:
Reverse mode differentiation with matrices (+ reflection on Quinn's lecture and mid-semester survey)
- Lecture 16, Monday week 6:
Autoencoders and PCA
- Lecture 17, Wednesday week 6:
PCA continued, SVD, Netflix prize, privacy
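As an illustration only (not one of the course's files), the PCA-via-SVD idea from this lecture is a short NumPy computation; the toy data below is invented for the sketch:

```python
import numpy as np

# PCA via the SVD (illustration only): project centred data onto its
# top-k principal components and form a rank-k reconstruction.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))  # toy correlated data
mu = X.mean(axis=0)
Xc = X - mu                                  # centre the data first
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T                            # k-dimensional projection (scores)
X_hat = Z @ Vt[:k] + mu                      # rank-k reconstruction of the data
print(Z.shape, X_hat.shape)
```

Because Xc = U S Vᵀ, the scores Z equal the first k columns of U scaled by the singular values.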
- Lecture 18, Thursday week 6:
Probabilistic and Bayesian regression (pdf, extra slides)
The video cut off a few seconds from the end, but nothing important: I just said the posterior is some Gaussian, as at the end of the typeset notes.
- Lecture 19, Monday week 7:
Bayesian inference and prediction
- Lecture 20, Wednesday week 7:
Bayesian linear regression review, and Bayesian model choice
- Lecture 21, Thursday week 7:
More on probabilistic reasoning, and start of Gaussian processes
Python demo to match start of lecture.
- Lecture 22, Monday week 8:
More on Gaussian processes
A minimal GP demo: Matlab/Octave, Python
Alternative GP demo: Matlab/Octave, Python
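The demo links above may no longer work. As an illustration only (not the course's actual demo files), drawing samples from a GP prior with a squared-exponential kernel looks roughly like this; the lengthscale and other settings are made up for the sketch:

```python
import numpy as np

# Minimal GP sketch (illustration only): draw functions from a Gaussian
# process prior with a squared-exponential (RBF) kernel.

def rbf_kernel(X1, X2, ell=1.0, sigma_f=1.0):
    """k(x, x') = sigma_f^2 exp(-(x - x')^2 / (2 ell^2)) for 1D inputs."""
    sq_dists = (X1[:, None] - X2[None, :]) ** 2
    return sigma_f**2 * np.exp(-0.5 * sq_dists / ell**2)

rng = np.random.default_rng(0)
X = np.linspace(-5, 5, 100)
K = rbf_kernel(X, X)
# Add jitter to the diagonal for numerical stability before factorizing.
L = np.linalg.cholesky(K + 1e-9 * np.eye(len(X)))
samples = L @ rng.standard_normal((len(X), 3))  # three draws from the prior
print(samples.shape)
```

Plotting each column of `samples` against `X` shows smooth random functions, which is what the in-lecture demo visualized.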
- Lecture 23, Wednesday week 8:
Finish GPs (with kernel logistic regression aside), start Bayesian logistic regression
GP readings as in Lecture 22
Preview of w8b
- Lecture 24, Thursday week 8:
Bayesian logistic regression and the Laplace approximation
- Lecture 25, Monday week 9:
More on Gaussian approximations, KL-divergence and variational methods
- Lecture 26, Wednesday week 9:
Variational methods continued
A minimal stochastic variational inference demo: Matlab/Octave: single-file, more complete tar-ball; Python version.
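The demo links above may no longer be available. As an illustration only (not the course's actual demo), the core of stochastic variational inference in one dimension is a few lines: fit q(w) = N(m, s²) to a known Gaussian target by stochastic gradient ascent on the ELBO with the reparameterization trick. The target N(3, 0.5²), learning rate, and step count are all invented for the sketch:

```python
import numpy as np

# Minimal stochastic variational inference sketch (illustration only):
# fit q(w) = N(m, s^2) to the target p(w) = N(3, 0.5^2) by maximizing a
# one-sample Monte Carlo estimate of the ELBO.

def grad_log_p(w, mu=3.0, sigma=0.5):
    """Gradient of log N(w; mu, sigma^2) with respect to w."""
    return -(w - mu) / sigma**2

rng = np.random.default_rng(1)
m, log_s = 0.0, 0.0   # variational parameters; log_s keeps the std positive
lr = 0.01
for _ in range(5000):
    s = np.exp(log_s)
    eps = rng.standard_normal()
    w = m + s * eps                        # reparameterized sample from q
    g = grad_log_p(w)
    m += lr * g                            # dELBO/dm: E[grad log p(w)]
    log_s += lr * (g * eps * s + 1.0)      # entropy term contributes +1 per step
print(m, np.exp(log_s))  # should approach the target's mean 3 and std 0.5
```

Because the target is Gaussian, the fitted q can match it exactly; with a non-Gaussian posterior the same updates find the closest Gaussian in the variational sense.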
- Lecture 27, Thursday week 9:
Mixtures of Gaussians for clustering and density estimation
- Lecture 28, Monday week 10:
Bound-based optimizers, Newton's method, L1 regularization
- Lecture 29, Wednesday week 10:
Ensembles, solving different problems
- Lecture 30, Thursday week 10:
Exam preparation advice.
That’s all folks! Have a great winter break.
Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426
Please contact our webadmin with any comments or corrections.
Unless explicitly stated otherwise, all material is copyright ©
The University of Edinburgh