Machine Learning Practical (MLP) 2017-18: Lectures
Semester 2
Lectures in semester 2 will start with a couple of sessions introducing the group projects, followed by a set of four guest lectures. Please note that some of the guest lectures will not be recorded.
All lectures at 11:10 on Wednesdays, George Square Theatre.
- Wednesday 17 January 2018. Lecture cancelled - apologies.
- Wednesday 24 January 2018. Introduction to group projects. Slides.
- Wednesday 31 January 2018. The GPU cluster; questions and answers (projects and computing). Slides.
- Wednesday 7 February 2018. Guest lecture: Ben Allison, Amazon. Building Production Machine Learning Systems. This lecture will not be recorded.
- Wednesday 14 February 2018. Guest lecture: Hakan Bilen, University of Edinburgh. Unsupervised learning of object landmarks from equivariance.
- Wednesday 21 February 2018. No lectures or labs - Flexible Learning Week.
- Wednesday 28 February 2018. Guest lecture: Vincent Wan, Google. Speech synthesis using LSTM auto-encoders. This lecture will not be recorded.
- Wednesday 7 March 2018. Guest lecture: Subramanian Ramamoorthy, University of Edinburgh and FiveAI. Problems for Machine Learning Practitioners from the Autonomous Driving Domain. This lecture will not be recorded.
Semester 1
Attend one of the following lectures each week.
Lecture recordings.
- Wednesday 20 September 2017. Introduction to MLP. Single layer networks (1): linear networks, gradient descent.
  Slides.
  Reading: Goodfellow et al., chapter 1; sections 4.3, 5.1, 5.7.
  Lecture recordings.
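As a companion to this first lecture's topics, here is a minimal NumPy sketch of batch gradient descent for a single-layer linear network (illustrative names only, not the course's mlpractical framework):

```python
import numpy as np

# Toy regression problem: targets generated by a known linear map,
# so gradient descent should recover true_W.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))             # 100 examples, 3 features
true_W = np.array([[2.0], [-1.0], [0.5]])
y = X @ true_W

W = np.zeros((3, 1))                      # parameters to learn
lr = 0.1                                  # learning rate
for _ in range(500):
    err = X @ W - y                       # residuals, shape (100, 1)
    grad = X.T @ err / len(X)             # gradient of 0.5 * mean squared error
    W -= lr * grad                        # gradient descent update
```

After enough iterations `W` is numerically close to `true_W`; stochastic and minibatch variants of this update are the subject of the next lecture.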
- Wednesday 27 September 2017 / Monday 2 October 2017. Single layer networks (2): stochastic gradient descent, minibatches, classification, cross-entropy, softmax.
  Slides.
  Reading: Nielsen, chapter 1; Goodfellow et al., sections 5.9, 6.1, 6.2, 8.1.
  Lecture recordings.
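The pieces named in this lecture (minibatches, softmax, cross-entropy) fit together as follows; this is a hedged sketch with made-up variable names and a toy dataset, not the lab code:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # stabilise before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy 3-class problem with well-separated class means.
means = np.array([[3, 0], [0, 3], [-3, -3]])
X = np.vstack([rng.normal(m, 0.5, size=(60, 2)) for m in means])
labels = np.repeat(np.arange(3), 60)
T = np.eye(3)[labels]                      # one-hot targets

W = np.zeros((2, 3)); b = np.zeros(3)
lr, batch = 0.1, 20
for epoch in range(30):
    order = rng.permutation(len(X))        # reshuffle each epoch
    for i in range(0, len(X), batch):      # iterate over minibatches
        idx = order[i:i + batch]
        P = softmax(X[idx] @ W + b)
        delta = (P - T[idx]) / len(idx)    # grad of cross-entropy wrt logits
        W -= lr * X[idx].T @ delta
        b -= lr * delta.sum(axis=0)

accuracy = (softmax(X @ W + b).argmax(axis=1) == labels).mean()
```

The `(P - T)` form of the gradient is the standard simplification when softmax outputs are paired with a cross-entropy error.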
- Wednesday 4 October 2017 / Monday 9 October 2017. Deep neural networks (1): hidden layers, back-propagation training, activation functions.
  Slides.
  Reading: Nielsen, chapter 2; Goodfellow et al., sections 6.3, 6.4; Bishop, sections 3.1, 3.2, chapter 4.
  Lecture recordings.
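For the back-propagation material, a useful self-check is to compare analytic gradients against finite differences. A minimal sketch for one tanh hidden layer (my own naming, assuming a mean-squared-error objective):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(5, 4))
y = rng.normal(size=(5, 1))
W1 = rng.normal(size=(4, 3)) * 0.5
W2 = rng.normal(size=(3, 1)) * 0.5

def loss(W1, W2):
    H = np.tanh(X @ W1)              # forward: hidden activations
    return 0.5 * np.mean((H @ W2 - y) ** 2)

# Backward pass: propagate the error signal layer by layer.
H = np.tanh(X @ W1)
err = (H @ W2 - y) / len(X)          # dLoss / dOutput
gW2 = H.T @ err                      # gradient for the output weights
dH = err @ W2.T                      # error signal at the hidden layer
gW1 = X.T @ (dH * (1 - H ** 2))      # tanh'(a) = 1 - tanh(a)^2

# Finite-difference check on one entry of W1.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
numeric = (loss(W1p, W2) - loss(W1, W2)) / eps
```

The numeric and analytic values should agree to several decimal places; this kind of gradient check is a common debugging step when implementing back-propagation by hand.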
- Wednesday 11 October 2017 / Monday 16 October 2017. Deep neural networks (2): basics of generalisation, training algorithms, initialisation.
  Slides.
  Reading: Goodfellow et al., sections 5.2, 5.3, 8.3, 8.5, 7.8; Karpathy, CS231n notes (Stanford).
  Additional Reading: Kingma and Ba (2015), Adam: A Method for Stochastic Optimization, ICLR-2015; Glorot and Bengio (2010), Understanding the difficulty of training deep feedforward networks, AISTATS-2010.
  Lecture recordings.
- Wednesday 18 October 2017 / Monday 23 October 2017. Deep neural networks (3): regularisation and normalisation.
  Slides.
  Reading: Nielsen, chapter 3; Goodfellow et al., chapter 7 (sections 7.1--7.5, 7.12).
  Additional Reading: Srivastava et al., Dropout: a simple way to prevent neural networks from overfitting, JMLR, 15(1):1929-1958, 2014; Ioffe and Szegedy, Batch Normalization, ICML-2015; Ba et al., Layer Normalization, arXiv:1607.06450.
  Lecture recordings.
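Of the regularisers covered here, dropout is simple enough to sketch in a few lines. This shows the "inverted" variant (rescaling at training time so nothing changes at test time); the function name and setup are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def dropout(h, p, training):
    """Zero each activation with probability p; rescale survivors by 1/(1-p)."""
    if not training or p == 0.0:
        return h                          # identity at test time
    mask = rng.random(h.shape) >= p       # keep each unit with probability 1 - p
    return h * mask / (1.0 - p)           # inverted dropout: rescale kept units

h = np.ones(10000)
out = dropout(h, p=0.5, training=True)    # roughly half the units are zeroed
```

Because of the `1/(1-p)` rescaling, the expected value of each activation is unchanged, so the same forward pass can be used at test time with `training=False`.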
- Wednesday 25 October 2017 / Monday 30 October 2017. Deep neural networks (4): computational graphs and autoencoders.
  Slides.
  Reading: Goodfellow et al., section 6.5, chapter 14 (esp. 14.1, 14.2, 14.3, 14.5, 14.9).
  Additional Reading: Olah, Calculus on Computational Graphs: Backpropagation; Kratzert, Understanding the backward pass through Batch Normalization Layer; Vincent et al., Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion, JMLR, 11:3371--3408, 2010.
  Lecture recordings.
- Wednesday 1 November 2017 / Monday 6 November 2017. Convolutional networks (1): introduction.
  Slides.
  Reading: Nielsen, chapter 6; Goodfellow et al., chapter 9 (sections 9.1-9.4).
  Additional Reading: LeCun et al., Gradient-Based Learning Applied to Document Recognition, Proc IEEE, 1998; Dumoulin and Visin, A guide to convolution arithmetic for deep learning, arXiv:1603.07285.
  Lecture recordings.
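The core operation in this lecture can be written naively in a few lines. A sketch of the 2-D (cross-)correlation used by convolutional layers, with "valid" padding and stride 1; real implementations use im2col-style reshaping or FFTs instead of explicit loops:

```python
import numpy as np

def conv2d(image, kernel):
    """Naive valid-mode cross-correlation of a 2-D image with a 2-D kernel."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output is a dot product of the kernel with one image patch.
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
kernel = np.ones((2, 2)) / 4              # 2x2 averaging filter
feature_map = conv2d(image, kernel)       # shape (3, 3)
```

With a 4x4 input and a 2x2 kernel the output is 3x3, illustrating how valid convolution shrinks the spatial dimensions by `kernel_size - 1`.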
- Wednesday 8 November 2017 / Monday 13 November 2017. Convolutional networks (2): backprop training and deep convolutional networks.
  Slides.
  Reading: Goodfellow et al., chapter 9 (sections 9.5-9.11).
  Additional Reading: Krizhevsky et al., ImageNet Classification with Deep Convolutional Neural Networks, NIPS-2012; Simonyan and Zisserman, Very Deep Convolutional Networks for Large-Scale Image Recognition, ILSVRC-2014; He et al., Deep Residual Learning for Image Recognition, CVPR-2016.
  Lecture recordings.
- Wednesday 15 November 2017 / Monday 20 November 2017. Recurrent neural networks (1): modelling sequential data.
  Slides.
  Reading: Goodfellow et al., chapter 10 (sections 10.1, 10.2, 10.3).
  Additional Reading: Mikolov et al., Recurrent Neural Network Based Language Model, Interspeech-2010.
  Lecture recordings.
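The defining idea of this lecture, reusing the same weights at every time step, can be sketched as a vanilla recurrent layer unrolled over a short sequence (sizes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
nh, nx, T = 4, 3, 6                       # hidden size, input size, sequence length
Wh = rng.normal(size=(nh, nh)) * 0.3      # recurrent weights, shared across steps
Wx = rng.normal(size=(nh, nx)) * 0.3      # input weights, shared across steps
xs = rng.normal(size=(T, nx))             # one input vector per time step

h = np.zeros(nh)                          # initial hidden state
states = []
for x in xs:                              # unroll through time
    h = np.tanh(Wh @ h + Wx @ x)          # same parameters at every step
    states.append(h)
states = np.stack(states)                 # shape (T, nh)
```

Training such a network by back-propagation through time amounts to running backprop over this unrolled loop, which motivates the gated architectures of the next lecture.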
- Wednesday 22 November 2017 / Monday 27 November 2017. Recurrent neural networks (2): LSTMs, gates, and some applications.
  Slides.
  Reading: Goodfellow et al., chapter 10 (sections 10.4, 10.5, 10.7, 10.10, 10.12).
  Additional Reading: C Olah, Understanding LSTMs; A Karpathy et al. (2015), Visualizing and Understanding Recurrent Networks, arXiv:1506.02078; R Srivastava et al., Training Very Deep Networks, NIPS-2015.
  Lecture recordings.
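One step of the standard LSTM gate equations from this lecture can be sketched as follows; the weight shapes and parameter packing are my own choices, not a reference implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, params):
    """One LSTM step: forget, input, and output gates acting on the cell state."""
    Wf, Wi, Wo, Wc, bf, bi, bo, bc = params
    z = np.concatenate([h, x])                 # gates see previous state and input
    f = sigmoid(Wf @ z + bf)                   # forget gate: what to keep of c
    i = sigmoid(Wi @ z + bi)                   # input gate: what to write
    o = sigmoid(Wo @ z + bo)                   # output gate: what to expose
    c_new = f * c + i * np.tanh(Wc @ z + bc)   # additive cell-state update
    h_new = o * np.tanh(c_new)                 # new hidden state
    return h_new, c_new

rng = np.random.default_rng(4)
nh, nx = 3, 2
params = [rng.normal(size=(nh, nh + nx)) for _ in range(4)] + \
         [np.zeros(nh) for _ in range(4)]
h, c = np.zeros(nh), np.zeros(nh)
h, c = lstm_step(rng.normal(size=nx), h, c, params)
```

The additive form of the cell-state update (`f * c + i * ...`) is what gives LSTMs better gradient flow over long sequences than the vanilla recurrent layer.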
You can discuss and ask questions about these lectures on Piazza, or talk to me at one of the "Office" hours in the Appleton Tower cafe (the one on the concourse in front of the lecture theatres):
- Mondays 10am (after the 9am lecture)
- Wednesdays 10am (before the 11am lecture)
Textbooks
- Introductory:
Michael Nielsen, Neural Networks and Deep Learning, 2016. This free online book has excellent coverage of feed-forward networks, training by back-propagation, error functions, regularisation, and convolutional neural networks. It uses the MNIST data as a running example.
- Comprehensive:
Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, 2016, MIT Press. The full text of the book is available on the authors' website.
- Older, but worthwhile reading:
Christopher M Bishop, Neural Networks for Pattern Recognition, 1995, Clarendon Press.
Copyright (c) University of Edinburgh 2015-2018
The MLP course material is licensed under the
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Public License
licence.txt
The software at https://github.com/CSTR-Edinburgh/mlpractical is licensed under the Modified BSD License.
This page maintained by Steve Renals.
Last updated: 2018/08/14 13:06:34 UTC