
MLPR class notes

When learning advanced material, you won't immediately understand everything just from reading notes. Please sign up to the forum, ask questions, and share insights and external materials that you have discovered.

Keyboard shortcut: step through the notes using the left and right arrow keys.

Please annotate the HTML versions of the notes, to keep the class's comments together. PDF versions are also available.

Background information

Week 1: Introduction to ML with Linear Regression

Week 2: ML fundamentals: generalization, error bars, Gaussians

Week 3: Classification and gradient-based fitting

Week 4: Bayesian linear regression

Week 5: Bayesian model choice and Gaussian processes

Week 6: More detailed models: Gaussian process kernels, more non-Gaussian regression

Week 7: Reading week

Week 8: Neural Networks

Week 9: Autoencoders, PCA, Netflix Prize

Week 10: Bayesian logistic regression, Laplace approximation

Week 11: Sampling-based approximate Bayesian inference, variational inference


A coarse overview of the major topics covered is given below. Some principles aren't taught in a single block because they're useful in multiple contexts: gradient-based optimization, different regularization methods, ethics, and practical choices such as feature engineering and numerical implementation.

You are encouraged to write your own outlines and summaries of the course. Aim to make connections between topics, and imagine explaining the main concepts of the course to someone else.