Probabilistic Modelling and Reasoning
This is a course for MSc level students. The course descriptor can be found here.
Prerequisites: to do PMR you need a reasonable background in statistics, calculus and linear algebra; please check the course descriptor.
You should also thoroughly review the maths in the following notes (from Iain Murray) before the start of the course, and attempt the self-check maths sheet. A handout on Mathematical background, which used to be used for the course LfD1, is also useful for PMR; all of its material is relevant.
PMR lectures will be in Lecture Theatre 2, on Tuesdays and Fridays 1000-1050 (information taken from the 2012/13 lecture timetable).
Books. The course textbook is Pattern Recognition and Machine Learning by Christopher M. Bishop (Springer, 2006).
- Other useful texts for some of the material are Information Theory, Inference and Learning Algorithms by David J. C. MacKay (Cambridge University Press, 2003; free online version available), Bayesian Reasoning and Machine Learning by David Barber (Cambridge University Press, 2012; free online version available), and Artificial Intelligence: A Modern Approach by S. Russell and P. Norvig (Prentice Hall, third edition, 2010).
There is also a page giving details of some books with maths background, and some web-based resources pointing to more advanced mathematical material.
A handout Introduction to MATLAB gives an introduction to MATLAB and the NETLAB neural networks toolbox. Further MATLAB tutorials are available: a Matlab Tutorial, the US Navy Matlab Tutorial and the MTU Introduction to Matlab.
- Introduction to Graphical Models and Bayesian Networks by Kevin Murphy.
- Thomas Minka's excellent Machine Learning/Pattern Recognition Glossary.
- Max Welling's Classnotes in Machine Learning.
- The Association for Uncertainty in Artificial Intelligence homepage is a good place to start looking for interesting research.
- The Kalman Filter website.
- A list of Public Domain Belief Network Tools, from a course on Belief Networks at Duke University.
- Zoubin Ghahramani's tutorial/overview paper Unsupervised Learning (original link), in Bousquet, O., Raetsch, G. and von Luxburg, U. (eds), Advanced Lectures on Machine Learning, LNAI 3176, Springer-Verlag (2004).
There will be one assignment in this course, worth 20% of the overall mark. Handout date: 19 October; due date: Friday 16 November by 4pm, by manual submission to the ITO. Feedback after approximately 2 weeks.
Here is the a12.zip file containing data and code for the assignment. The NETLAB software is available via the downloads page from the link; use version 3.3 (you will also need foptions.m).
Tutorials will start in week 3. A tutorial groups listing is available from the ITO (this may only be visible from within .ed.ac.uk). In case of problems with the assignment to specific groups, contact the ISS via the support form.
Sergey Dudoladov s1233200
We will be using the program JavaBayes; you can find more details via the link. We will also be using some MATLAB code.
Office hour: Friday 11-12, starting in week 2. It is probably best to catch me after the lecture; I will then walk over to my office in IF 2.27.
PMR will be examined in the December 2012 exam diet.
Course materials will build up here ...
Self-check maths sheet: check if you can do the questions. No tutorials in week 1 (nor week 2).
Belief networks continued. No tutorials in week 2, but see the sheet below for the tutorial in week 3.
Worked example for the Holmes/Watson Bayes net.
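The handout itself is not reproduced here, but this style of belief-network reasoning can be sketched by brute-force enumeration. The small network below (Burglary -> Alarm <- Earthquake) and all of its probabilities are illustrative stand-ins, not the handout's values, and it is written in Python rather than the course's MATLAB:

```python
# Hypothetical belief network: Burglary (B) -> Alarm (A) <- Earthquake (E).
# All numbers are invented for illustration only.
P_B = 0.01                      # prior P(B=1)
P_E = 0.02                      # prior P(E=1)
P_A = {(1, 1): 0.95, (1, 0): 0.90,
       (0, 1): 0.30, (0, 0): 0.01}   # P(A=1 | B, E)

def posterior_burglary_given_alarm():
    """P(B=1 | A=1) by enumerating the joint over B and E."""
    joint = {}
    for b in (0, 1):
        for e in (0, 1):
            pb = P_B if b else 1 - P_B
            pe = P_E if e else 1 - P_E
            joint[(b, e)] = pb * pe * P_A[(b, e)]   # P(B=b, E=e, A=1)
    evidence = sum(joint.values())                  # P(A=1)
    return (joint[(1, 0)] + joint[(1, 1)]) / evidence

print(posterior_burglary_given_alarm())  # prior of 0.01 rises to about 0.37
```

Observing the alarm raises the probability of burglary from 1% to roughly 37% under these made-up numbers; the same enumeration pattern works for any small discrete network.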
Gaussian distribution ctd,
Maximum likelihood estimation slides
Bayesian parameter estimation
Tutorial sheet on the Monty Hall problem and inference with Gaussian random variables; answer sheet ans1.pdf.
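As a quick sanity check on the Monty Hall answer, the game is easy to simulate. This is a rough Python sketch (the tutorial itself is pen-and-paper probability; the trial count and seed are arbitrary):

```python
import random

def monty_hall(switch, trials=20000, seed=0):
    """Simulate the Monty Hall game; return the empirical win rate."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Host opens a door that is neither the contestant's pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(monty_hall(switch=False))  # about 1/3
print(monty_hall(switch=True))   # about 2/3
```

Switching wins about twice as often as staying, matching the analytical answer.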
Matlab code cointoss.m: code to illustrate the posterior distribution under a Beta prior.
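cointoss.m is not reproduced here, but the conjugate update it illustrates is short: with a Beta(a, b) prior on the coin's heads probability, observing n_h heads and n_t tails gives a Beta(a + n_h, b + n_t) posterior. A rough Python analogue (the Beta(1, 1) prior and the counts below are arbitrary choices):

```python
# Conjugate Bernoulli/Beta update.
def beta_posterior(a, b, n_heads, n_tails):
    """Posterior Beta parameters after observing coin tosses."""
    return a + n_heads, b + n_tails

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Uniform Beta(1, 1) prior, then 7 heads and 3 tails observed.
a_post, b_post = beta_posterior(1.0, 1.0, n_heads=7, n_tails=3)
print(a_post, b_post, beta_mean(a_post, b_post))  # Beta(8, 4), mean 2/3
```

The posterior mean 2/3 sits between the prior mean 1/2 and the empirical frequency 0.7, which is the shrinkage behaviour the MATLAB demo plots.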
Paper by David Heckerman entitled "A Tutorial on Learning with Bayesian Networks".
NO LECTURE ON FRI 12 OCT
Mixture models slides
Working for Gaussian
answer sheet ans2.pdf
Mixture models continued,
Factor analysis and beyond slides
Working for EM for Mixture of Gaussians.
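The E- and M-steps in that working can be sketched in the one-dimensional, two-component case. This is a minimal Python sketch rather than the course's MATLAB; the data and initialisation are invented:

```python
import math

def normal_pdf(x, mu, var):
    """Density of a 1-D Gaussian N(mu, var) at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_mog(data, mus, variances, weights, iters=50):
    """EM for a 1-D mixture of Gaussians."""
    for _ in range(iters):
        # E-step: responsibilities r[n][k] = P(component k | x_n).
        resp = []
        for x in data:
            p = [w * normal_pdf(x, m, v)
                 for w, m, v in zip(weights, mus, variances)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate each component from its responsibilities.
        for k in range(len(mus)):
            nk = sum(r[k] for r in resp)
            mus[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            variances[k] = sum(r[k] * (x - mus[k]) ** 2
                               for r, x in zip(resp, data)) / nk
            weights[k] = nk / len(data)
    return mus, variances, weights

data = [-2.1, -1.9, -2.0, -1.8, 2.0, 2.2, 1.9, 2.1]
mus, variances, weights = em_mog(data, [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5])
print(mus)  # means move to the two clusters, near -1.95 and 2.05
```

With two well-separated clusters the responsibilities harden quickly and the fitted means land on the cluster averages.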
Handout: Working for PCA solution as principal eigenvector.
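The handout's result, that the first principal component is the leading eigenvector of the sample covariance matrix, can be checked numerically. A Python/NumPy sketch with synthetic data elongated along the (1, 1) direction (the data-generating recipe is invented for illustration):

```python
import numpy as np

# Toy 2-D data: both coordinates share a common factor z, plus small noise,
# so the data cloud is elongated along (1, 1).
rng = np.random.default_rng(0)
z = rng.normal(size=500)
X = np.column_stack([z + 0.1 * rng.normal(size=500),
                     z + 0.1 * rng.normal(size=500)])

Xc = X - X.mean(axis=0)                 # centre the data
cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                    # leading (unit-norm) eigenvector
print(pc1)  # close to +/- (1, 1) / sqrt(2)
```

The leading eigenvector recovers the direction of greatest variance, here (1, 1)/sqrt(2) up to sign and sampling noise.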
Factor Analysis and Beyond
matlab code plotquad.m
answer sheet ans3_1213.pdf
Factor analysis and beyond ctd,
Bayesian Model Selection slides
Hidden Markov models slides
answer sheet ans4_1213.pdf
Lee's Blind source separation demo. Web resource: a short explanation of blind source separation.
Paper on GTM by C. M. Bishop, M. Svensen and C. K. I. Williams (original link). Reading from David MacKay's book Information Theory, Inference and Learning Algorithms: see sections 28.1, 28.2 on model comparison.
Hidden Markov Models continued,
Time Series Modelling and Kalman Filters
answer sheet ans5_1213.pdf
Working for alpha and beta
recursions for HMMs
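The alpha and beta recursions in that working can be sketched directly. A Python sketch rather than the course's MATLAB; the 2-state HMM parameters and observation sequence below are illustrative only:

```python
import numpy as np

A = np.array([[0.7, 0.3],   # A[i, j] = P(z_t = j | z_{t-1} = i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],   # B[i, k] = P(x_t = k | z_t = i)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])   # initial state distribution
obs = [0, 0, 1, 0]          # observed symbol sequence

T, S = len(obs), len(pi)
alpha = np.zeros((T, S))
beta = np.ones((T, S))      # beta_T = 1 by convention

alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):                       # forward (alpha) recursion
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
for t in range(T - 2, -1, -1):              # backward (beta) recursion
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

likelihood = alpha[-1].sum()                # P(x_{1:T})
gamma = alpha * beta / likelihood           # smoothed P(z_t | x_{1:T})
print(likelihood)
print(gamma)
```

Two useful checks fall out of the recursions: alpha_t * beta_t sums to the same likelihood at every t, and each row of gamma is a proper distribution over states.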
The L. Rabiner tutorial on HMMs from Proceedings of the IEEE 77(2), 1989, is available from the IEL electronic library.
A Gentle Tutorial on the EM Algorithm and its Application to Parameter
Estimation for Gaussian Mixture and Hidden Markov Models by Jeff A. Bilmes
Harmonising chorales in the style of Johann Sebastian Bach by Moray Allan, using an HMM (MSc, School of Informatics, Edinburgh, 2002). See also the HMM Bach demo.
Time Series Modelling. Web resource: movie clips from Prof Andrew Blake's group illustrating tracking with non-linear models.
Time Series Modelling and Kalman Filters ctd,
Junction tree algorithm
answer sheet ans6_1112.pdf
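The predict/update cycle behind the Kalman filter slides can be sketched in the scalar case. A Python sketch with a constant underlying state; the noise levels and observations are made up:

```python
# Scalar Kalman filter: state x_t = x_{t-1} + w (variance q),
# observation y_t = x_t + v (variance r).
def kalman_1d(observations, q=0.01, r=1.0, mu0=0.0, p0=10.0):
    """Return the filtered mean after each observation."""
    mu, p = mu0, p0
    estimates = []
    for y in observations:
        p = p + q                  # predict: variance grows by process noise
        k = p / (p + r)            # Kalman gain
        mu = mu + k * (y - mu)     # update mean with the innovation
        p = (1 - k) * p            # posterior variance shrinks
        estimates.append(mu)
    return estimates

ys = [4.8, 5.3, 4.9, 5.1, 5.2, 4.7, 5.0]
print(kalman_1d(ys))  # estimates settle near the underlying level of about 5
```

With a vague prior (large p0) the first update jumps almost to the first observation, and later updates behave like a decaying-weight running average.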
Handout Worked example of
inference in a junction tree
Worked example for c->b->a network
NO LECTURE ON FRIDAY 16 NOV
Undirected graphical models
answer sheet ans7_1112.pdf
Working for local Markov property
Working for Boltzmann machine
Working for derivative of a log likelihood (from MSR Cambridge)
Last lecture on Tues 20 Nov: finish off undirected graphical models, followed by a question and answer session. If there is time I will then discuss Coding and Information Theory (not examinable).
answer sheet ans8_1112.pdf
I will hold a revision session on Thurs 6 Dec, 10-11am, in the Hugh Robson Building. This will be in a question and answer format, not a lecture.
This page is maintained by Chris Williams.