Machine Learning Practical (MLP): Semester 1, 2015
[course descriptor]
News
- Additional lecture on recurrent neural networks (RNNs): 10:00, Friday 4 December 2015, Teviot Lecture Theatre, Medical School, Doorway 5.
- Second coursework now available on the MLP GitHub as
07_MLP_Coursework2.ipynb
- First coursework now available on the MLP GitHub as
03_MLP_Coursework1.ipynb
- Fix for notebook 'kernel error'. If you were affected by the kernel error problem last week, please see the fix in
kernel_issue_fix.md
on the MLP GitHub.
- Coursework update. The first coursework will be issued on Monday 12 October, with a deadline of Thursday 29 October, 4pm.
- Wednesday Lab Session. Wednesdays, 11:00, in both rooms FH-3.D01 and FH-3.D02.
- Additional Lab Session. Mondays, 12:00 in room FH-3.D01.
- GitHub. The lab material will be made available using GitHub at https://github.com/CSTR-Edinburgh/mlpractical.
- MLP Forum. Please sign up for the MLP Forum.
Introduction
The coursework-based Machine Learning Practical (MLP) is focused on the implementation and evaluation of machine learning systems. Students who take this course will gain experience in the design, implementation, training, and evaluation of machine learning systems.
In 2015-16 the course will focus on deep neural networks.
MLP requires mathematical ability (calculus, linear algebra, and probability) and programming ability (the course will be based on Python using Numpy). Some previous experience of machine learning is extremely helpful.
Undergraduates: If you have taken Informatics 2B and IAML, and can program, you are qualified to do the course.
Taking MLPR is also recommended.
MLP 2015: Deep Neural Networks
This year the MLP course will focus on deep neural networks for the classification of handwritten digits using the well-known MNIST dataset. Using a Python software framework that we shall provide, together with a series of iPython notebooks, you will train multi-layer neural network classifiers and convolutional network classifiers to address this handwritten digit classification problem. There will be a series of eight weekly lectures to provide the theoretical support required for the practical work.
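To give a flavour of the practical work before the framework is released, here is a minimal numpy sketch of a single gradient-descent update for a one-layer softmax classifier on MNIST-style inputs (784 pixels, 10 digit classes). The array shapes, variable names, and constants are illustrative only; they are not those of the course framework.

```python
import numpy as np

rng = np.random.RandomState(0)

# Illustrative mini-batch: 64 flattened 28x28 images and integer digit labels.
inputs = rng.rand(64, 784)
targets = rng.randint(0, 10, size=64)

# Single-layer softmax classifier: weights (784 x 10) and biases (10,).
W = 0.01 * rng.randn(784, 10)
b = np.zeros(10)

def softmax(a):
    # Subtract the row-wise maximum for numerical stability.
    e = np.exp(a - a.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Forward pass: class probabilities for each example in the batch.
probs = softmax(inputs.dot(W) + b)

# Cross-entropy error averaged over the mini-batch.
error = -np.mean(np.log(probs[np.arange(64), targets]))
print(error)

# Backward pass: gradient of the error w.r.t. the pre-softmax activations.
delta = probs.copy()
delta[np.arange(64), targets] -= 1.0
delta /= 64

# One gradient-descent update with learning rate 0.1.
W -= 0.1 * inputs.T.dot(delta)
b -= 0.1 * delta.sum(axis=0)
```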
Schedule
The first lecture will be in Week 1: Wednesday 23 September at 10am, in F.21, 7 George Square.
- Lectures: Wednesdays, 10:00, room F.21, 7 George Square
- Labs:
Currently four lab sessions are scheduled; students are expected to attend one of these.
- Lecturer:
- Teaching assistant:
- Demonstrator: Matt Graham
- Marker: Vladimir Nikishkin
Lectures
- Wednesday 23 September. Introduction to MLP. Linear networks. Gradient descent. Slides / 6-up slides
- Wednesday 30 September. Stochastic gradient descent. Classification. Slides / 6-up slides
- Wednesday 7 October. Multilayer networks. Slides / 6-up slides
- Wednesday 14 October. Introduction to coursework 1; Generalisation (part 1). Slides / 6-up slides
- Wednesday 21 October. Learning rate schedules (see the sketch below this list), generalisation (part 2), more on softmax. Slides / 6-up slides
- Wednesday 28 October. Hidden unit transfer functions, autoencoders, and pretraining. Slides / 6-up slides
- Wednesday 4 November. Convolutional networks (1). Slides / 6-up slides
- Wednesday 11 November. Convolutional networks (2). Slides
- Friday 4 December. Recurrent neural networks. Slides. (Extra lecture, optional. This lecture will take place at 10:00, Friday 4 December 2015, Teviot Lecture Theatre, Medical School, Doorway 5.)
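For the learning rate schedules mentioned in the lecture of 21 October, the sketch below shows two common decay schedules written as plain Python functions. The function names and constants are illustrative assumptions, not the ones used in the course framework or coursework.

```python
import numpy as np

def exponential_schedule(epoch, eta0=0.5, r=0.1):
    # Learning rate shrinks by a constant factor exp(-r) each epoch.
    return eta0 * np.exp(-r * epoch)

def reciprocal_schedule(epoch, eta0=0.5, r=0.1):
    # Learning rate proportional to 1 / (1 + r * epoch).
    return eta0 / (1.0 + r * epoch)

# Print both schedules for the first few epochs.
for epoch in range(5):
    print(epoch, exponential_schedule(epoch), reciprocal_schedule(epoch))
```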
You can discuss and ask questions about these lectures on the online MLP Forum.
Labs
Currently four lab sessions are scheduled; students are expected to attend one of these.
The lab material will be made available using GitHub at https://github.com/CSTR-Edinburgh/mlpractical. You do not require a GitHub login to use this. Labs will use iPython notebooks.
There are many Python/numpy tutorials on the web; this is a good one: http://cs231n.github.io/python-numpy-tutorial/.
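If numpy is new to you, it is worth running a few lines like the following (purely illustrative) before the first lab:

```python
import numpy as np

# A 2x3 array of floats.
x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(x.shape, x.dtype)      # (2, 3) float64

# Element-wise arithmetic with broadcasting.
print(x * 2 + 1)

# Matrix-vector product: (2, 3) dot (3,) gives a (2,) result.
w = np.array([0.1, 0.2, 0.3])
print(x.dot(w))
```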
Coursework
- Coursework 1: Available 12 October 2015, Submit 29 October 2015, Feedback 12 November 2015. The coursework is now available as
03_MLP_Coursework1.ipynb
on the MLP GitHub. Complete it in the iPython notebook and submit your resulting notebook.
- Coursework 2: Available 16 November 2015, Submit 14 January 2016, Feedback 4 February 2016. The coursework is now available as
07_MLP_Coursework2.ipynb
on the MLP GitHub. Complete it in the iPython notebook and submit your resulting notebook.
Please make sure you have read and understood
Reading
Textbooks
- Michael Nielsen, Neural Networks and Deep Learning, 2015. This free online book has excellent coverage of feed-forward networks, training by back-propagation, error functions, regularisation, and convolutional neural networks. It uses the MNIST data as a running example and comes with a Python codebase. Do not use that codebase in your practical work, use the one we provide: the aim of this course is to enable you to do your own implementation. However, this book is an excellent resource and is highly recommended.
- Yoshua Bengio, Ian Goodfellow and Aaron Courville, Deep Learning, 2015, Book in preparation for MIT Press. Chapters 6-9 are most relevant.
- Christopher M Bishop, Neural Networks for Pattern Recognition, 1995, Clarendon Press.
Additional material
- Learning Rate Schedules: A Senior et al, An empirical study of learning rates in deep neural networks, Proc IEEE ICASSP, 2013.
- Maxout: I Goodfellow et al, Maxout networks, Proc ICML, 2013.
- Denoising autoencoders: P Vincent et al, Stacked Denoising Autoencoders: Learning Useful Representations in a Deep Network with a Local Denoising Criterion, JMLR, 11:3371--3408, 2010.
- Y LeCun et al, Efficient backprop, Neural Networks: Tricks of the Trade, Springer LNCS 1524, 1998.
- Convolutional Networks: Y LeCun et al, Gradient-Based Learning Applied to Document Recognition, Proc IEEE, 1998.
Review articles (including material not covered in this course)
- Y Bengio, Learning deep architectures for AI, Foundations and trends in Machine Learning, 2009
- G Hinton, Boltzmann machine, Scholarpedia, 2007.
- Y LeCun et al, A Tutorial on Energy-Based Learning, to appear in "Predicting Structured Data", MIT Press, 2006.
This page maintained by Steve Renals.
Last updated: 2016/08/10 17:04:34 UTC