Welcome! This course examines the principles, methods, and applications of machine learning: the business of using data to enable computers to perform tasks that would be infeasible to do well by direct programming. The course aims to provide a firmly grounded understanding of the field and a knowledge of the methods that can be used to design machine learning tools. Alongside these fundamentals, I will cover the theory and practice of a core set of machine learning tools, and students will gain practical experience using them. Please see this handout for more detailed information.
Students with little mathematical background would be well advised to do some preparatory work before the course begins.
Some of you may find the mathematics in this course challenging. Here is more information about the mathematical background assumed.
We will be using Nota Bene as the discussion forum for this course. You can sign up for the forum here. The TA and I will be monitoring the site regularly. I urge you to use the discussion site to ask any questions about the course.
To encourage you to use the site, we will have several special policies:
Although NB supports Firefox, Chrome, Safari, and IE9, some students have reported problems this year with certain browsers. If you run into technical trouble, try logging in with Chrome. Bugs can be reported on the NB discussion forum; in my experience the NB team are very responsive.
The course text will be Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, MIT Press, 2012.
A similar recent book, which is also excellent, is Bayesian Reasoning and Machine Learning by David Barber, 2012. It is available as a free PDF online.
Either of the above two books is sufficient to do well in this course. For those who want to read even more, I'd also suggest:
David MacKay's book, "Information Theory, Inference and Learning Algorithms" is a classic book that gives a different perspective on many of the methods in this course. It can be read online for free.
Finally, the book The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Hastie, Tibshirani, and Friedman (PDF available online) offers a perspective on machine learning from several leading statisticians. It is very much complementary to the course texts.
A mock exam paper is available. It is a little longer than the real exam will be, so you should not expect to complete it in the time available. Use this paper to familiarize yourself with the types of questions you might see and with the more recently added topics. I will also try to broadly follow the types of questions used in previous years' exam papers.
Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426, E-mail: firstname.lastname@example.org
Please contact our webadmin with any comments or corrections.
Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh