Introductory Applied Machine Learning
Course Homepage

Lecturers: Nigel Goddard
TAs: Stefanos Angelidis, Victor Jose Hernandez Urbina
Course reps: Victor Chen, Sanchit Gupta

The goal of this course is to introduce students to basic algorithms for learning from examples, focusing on classification and clustering problems. This is a level 10 course intended for MSc students and 3rd/4th year undergraduates.

For an overview of the planned course topics, see the Course Catalog entry and the 2015 introductory handout.


Announcements

16/11/2015 - The kNN quiz and Mixture Models/EM quiz are online; please complete them by Wednesday for review on Thursday.
9/11/2015 - The Clustering quiz and remaining online lectures are posted.
1/10/2015 - Naive Bayes quiz posted. ITO should be allocating tutorials and labs by the end of the week.
28/9/2015 - I will use the results of the quizzes this week (week 2) to decide what, if anything, to do in the lecture slots next week (week 3). If everyone is doing well in the quizzes, we will skip the lecture slots; otherwise there will be an optional review session for those who need it. Check the schedule below to see if a review is happening.
28/9/2015 - For the next few weeks this webpage will be updated quite frequently, so check it regularly. I will use the email list for important announcements, but not for every change.
27/9/2015 - Lectures this week and next will be online. Review the videos, then take the quiz. If you are not satisfied with your quiz performance, review the videos for the relevant slides (see the revise links).


Lectures

Lecture slots are 14:10-15:00 on Mondays and Thursdays in Appleton Tower, Lecture Theatre 3.

This year, for the first time, we will be trialling some "flipped-classroom" methods, and we will be looking for student feedback during the course on how this is working for you. Approximately half of the material will be delivered in the traditional fashion, via a lecture during the lecture slots above. The remaining material will be delivered as short online video segments (totalling approximately the length of a traditional lecture), which you should watch before the lecture slot. During some of the lecture slots, we will run other activities to review the material covered in the videos.


Discussion Forum

You will get an email invite to join the forum. The forum will contain detailed lecture notes annotated with questions and answers. It is monitored by the lecturer and the TAs, so if you ask questions here, you are likely to get a much faster response than if you email the lecturer or TAs individually. (However, if you have issues that should be kept confidential, then of course please do email the course lecturer.)


Assessment

There will be two equally-weighted assignments, together worth 25% of the mark for the course. These will be released, and submissions due, according to the schedule below.

Assignment  Released                 Due
1           Week 5, Monday, 4 p.m.   Week 7, Monday, 4 p.m.
2           Week 8, Friday, 4 p.m.   Week 11, Monday, 4 p.m.

You are required to submit both an electronic copy and a manual copy to the ITO by the deadlines. The deadlines are strictly enforced.

Assignment #1

The question sheet is here: assignment1.pdf [discuss]. It is due at 1600 on Monday of Week 7 (Monday 2nd November 2015) by manual submission to the ITO and electronic submission (please see the question sheet for detailed submission instructions). The data files are here: train_20news_partA.arff, train_20news_partB.arff, train_auto_partA.arff, train_auto_partB_numeric.arff.
Marks will be returned within two weeks of the due date.

Assignment #2

The question sheet is here: assignment_2.pdf [discuss]. It is due at 1600 on Monday of Week 11 (Mon 30 Nov 2015) by manual submission to the ITO and electronic submission (please see the question sheet for detailed submission instructions). The data files are here: cluster_means.txt, train_20news_partA.arff, train_mnist_dd01_partB.arff, train_mnist_dd02_partB.arff, train_mnist_binary_partC.arff, train_mnist_binary_pairConj_partC.arff, train_images_partB.arff, valid_images_partB.arff, test_images_partB.arff. The website where you can view images like the ones in the training and validation sets is here.
Marks will be returned within two weeks of the due date.

Please read the Informatics policy on late submissions and plagiarism.


Tutorials

Tutorials will be in weeks 3, 5, 7 and 9. [Groups, times and rooms]


Labs

Labs will be in weeks 3, 5, 7 and 9. [Groups, times and rooms]

Week-by-Week listing

(This list is subject to change.)

Week 1

Lecture Theatre:
Mon: Lecture: Introduction [discuss] [notes] [video(2014)]
Thu: Lecture: Basic Probability and Estimation [discuss] [previous Q+A] [notes] [video(2014)] [Estimation Video(2014)]
Tutorials: No tutorials in week 1 (nor week 2)
Readings: Textbook chapters 1, 2
Mathematical preliminaries: These Supplementary Mathematics notes are from the old Learning from Data course. They are more difficult than what we will need for IAML, but if you are happy with them you should have no problem with the IAML maths level.

Week 2

Lecture Theatre: None
Online Lectures:
Thinking about Data [watch] [revise] [discuss] [previous Q+A] [notes] [quiz]
Naive Bayes Classifier [watch] [revise] [discuss] [notes] [quiz deadline 17:00 5/10/2015]
Tutorials: No tutorials in week 2
Readings: Textbook chapters 7.1, 7.2
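(The Naive Bayes lecture above treats classification with categorical attributes. The course labs use Weka, but the core idea can be sketched in plain Python; the toy weather data and function names below are illustrative, not from the course materials.)

```python
# Minimal categorical Naive Bayes with Laplace (add-one) smoothing.
# Sketch only: the course labs use Weka's implementation instead.
from collections import Counter, defaultdict
import math

def train_nb(examples):
    """examples: list of (feature_tuple, label) pairs."""
    class_counts = Counter(label for _, label in examples)
    # feature_counts[label][i][v] = times feature i took value v given label
    feature_counts = defaultdict(lambda: defaultdict(Counter))
    values = defaultdict(set)  # distinct values seen for each feature index
    for feats, label in examples:
        for i, v in enumerate(feats):
            feature_counts[label][i][v] += 1
            values[i].add(v)
    return class_counts, feature_counts, values

def predict_nb(model, feats):
    class_counts, feature_counts, values = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for label, n in class_counts.items():
        lp = math.log(n / total)  # log prior P(class)
        for i, v in enumerate(feats):
            count = feature_counts[label][i][v]
            # Laplace smoothing avoids zero probabilities for unseen values
            lp += math.log((count + 1) / (n + len(values[i])))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Working in log space avoids numerical underflow when many attribute likelihoods are multiplied together, which is how Naive Bayes is usually implemented in practice.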

Week 3

Lecture Theatre:
Thu: Review: Thinking about Data and Naive Bayes classification
Online Lectures:
Decision Tree learning [watch] [revise] [discuss] [notes] [quiz]
Generalization and Evaluation [watch] [revise and related] [discuss] [notes] [generalisation quiz] [evaluation quiz]
Tutorial 1: Naive Bayes and Data Representation [solutions]
Lab 1: Naive Bayes classification
Readings: Textbook chapter 4.2

Week 4

Lecture Theatre:
Mon: Free
Thu: Review: Generalisation and Evaluation
Online lecture
Nearest Neighbour [watch] [revise and related] [discuss] [notes] [quiz]
Tutorials and labs: None in week 4
Readings: Textbook chapters 3.2, 3.3, 4.3, 6.1, 6.5
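(The Nearest Neighbour lecture above covers distance-based classification. As a rough sketch of the idea in plain Python, assuming Euclidean distance and a simple majority vote; the data and names are illustrative, not from the course materials:)

```python
# Minimal k-nearest-neighbour classifier: find the k closest training
# points to the query and take a majority vote over their labels.
from collections import Counter
import math

def knn_predict(train, x, k=3):
    """train: list of (point, label); x: query point (numeric sequences)."""
    # Sort training examples by Euclidean distance to the query point
    neighbours = sorted(train, key=lambda ex: math.dist(ex[0], x))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```

Note that kNN does no work at training time; all the computation happens at prediction time, which is why it is often called a "lazy" learner.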

Week 5

Lecture Theatre:
Mon: Lecture: Linear regression [discuss] [notes] [video(2014)]
Thu: Review: Decision Trees
Tutorial 2: Decision Tree and Gaussian Naive Bayes [solutions]
Lab 2: Attribute selection and Regression
Readings: Textbook chapter 4.6 (but pairwise classification, perceptron learning, Winnow are not required)

Week 6

Lecture Theatre:
Mon: Lecture: Logistic regression [slides] [pdf] [video (2014)], Optimisation [slides] [pdf]
Thu: Lecture: Regularisation [slides] [pdf], Support vector machines part I [slides] [pdf] [video (2014)]
Readings: Textbook chapter 4.6 (but pairwise classification, perceptron learning, Winnow are not required); 6.3 (max margin hyperplane, nonlinear class boundaries), SVM handout. SV regression is not examinable.

Week 7

Lecture Theatre:
Mon: Lecture: Support vector machines Part II [slides] [pdf] [video (2014)]
Thu: Review: Nearest Neighbour
Online lecture
Clustering [watch] [revise] [discuss] [notes] [quiz]
Readings: Textbook chapters 5, 4.7, 6.4
Tutorial 3: Logistic regression [solutions]
Lab 3: Support Vector Machines, Evaluation
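(The Clustering online lecture above introduces k-means, which Thursday of week 8 then reviews. The algorithm alternates two steps until the assignments stop changing; a rough sketch in plain Python, with illustrative data and names not taken from the course materials:)

```python
# Minimal k-means: alternate (1) assigning each point to its nearest
# centroid and (2) moving each centroid to the mean of its points,
# until the centroids stop moving.
import math

def kmeans(points, centroids, max_iter=100):
    """points: list of numeric tuples; centroids: k initial centres."""
    centroids = [tuple(c) for c in centroids]
    assign = []
    for _ in range(max_iter):
        # Assignment step: index of the nearest centroid for each point
        assign = [min(range(len(centroids)),
                      key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        # Update step: each centroid becomes the mean of its members
        new = []
        for j in range(len(centroids)):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                new.append(tuple(sum(c) / len(members)
                                 for c in zip(*members)))
            else:
                new.append(centroids[j])  # keep empty clusters in place
        if new == centroids:
            break  # converged: no centroid moved
        centroids = new
    return centroids, assign
```

Each iteration can only decrease (or leave unchanged) the total within-cluster squared distance, which is why the procedure is guaranteed to converge, though only to a local optimum that depends on the initial centroids.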

Week 8

Lecture Theatre:
Mon: Free
Thu: Review: k-Means clustering
Online Lectures/Quizzes
Expectation Maximization Algorithm [watch] [revise] [discuss] [notes] [quiz]
Dimensionality Reduction [watch] [revise and related] [discuss] [notes] [quiz]

Readings: Textbook chapters 4.8, 6.6

Week 9

Lecture Theatre:
Mon: Free
Thu: Review: Mixture Models, EM and Nearest Neighbours
Online lectures
Hierarchical clustering [watch] [revise] [discuss] [notes]
Tutorial 4: SVMs, Clustering
Lab 4: PCA, Clustering, Evaluation
Additional demo: Eigenfaces (not examinable)

Week 10

Lecture Theatre:
Mon: Take the Quiz: Dimensionality Reduction
Thu: Review: Dimensionality Reduction (and Hierarchical Clustering?)

This page is maintained by Nigel Goddard.


Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426
Please contact our webadmin with any comments or corrections.
Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh