Learning from Data

You are expected to attend the lectures. LfD is really about a way of thinking, and perhaps a different way of thinking than is found in many other informatics courses. The lectures are likely to provide more insight into that than any other resource. To save you time taking verbatim notes in the lectures, the lecture slides will be provided here. In addition, more detailed notes will be provided at the beginning of the course; electronic copies of those notes are available here too. Thanks to David Barber for allowing his notes to continue to be used for this course.

Lecture notes:

Introduction

The MATLAB introductory notes

Supplementary maths notes

Mathematics and Data

Dimensionality Reduction

Density Estimation: Gaussian

Nearest Neighbour Methods

Naive Bayes

Logistic Regression

Linear Parameter Models

Generalisation

Layered Neural Networks

Adaptive Basis Functions

Mixture Models

Real World Considerations

Lecture slides (if you print these, I recommend printing them 4 to a page, and do not forget to print in landscape):

lecture 1: introduction

lecture 2: mathematical preliminaries

lecture 3: thinking about data

lecture 4: density estimation: maximum likelihood

lecture 5: density estimation: Gaussian

lecture 6: dimensionality reduction

lecture 7: nearest neighbour methods

lecture 8: naive Bayes

lecture 9: visualization 1

lecture 10: logistic regression: model

lecture 11: logistic regression: learning

lecture 12: regression models

lecture 13: generalisation

lecture 14: multilayered perceptrons

lecture 15: multilayered perceptrons 2

lecture 16: multilayered perceptrons 3

lecture 17: adaptive basis functions

lecture 18: Gaussian mixture models

lecture 19: dealing with real data

The eigenfaces demo is available. The MATLAB program is facepca.m and the face data is available as faces.mat. The face data comes from the AR Face Database at Purdue, which holds all the copyrights; it is used with permission.
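To give a rough idea of what an eigenfaces demo of this kind computes, here is a minimal pure-Python sketch of the underlying PCA step: stack the images as rows, subtract the mean face, and find the leading eigenvector ("eigenface") of the covariance by power iteration. This is an illustrative sketch, not the course's facepca.m; the tiny 3-pixel "faces" and all variable names are invented for the example.

```python
import math

# Four made-up 3-pixel "face" vectors (illustrative data only)
faces = [
    [2.0, 0.0, 1.0],
    [4.0, 2.0, 3.0],
    [6.0, 4.0, 5.0],
    [8.0, 6.0, 7.0],
]

n, d = len(faces), len(faces[0])

# Mean face, and the data centred around it
mean = [sum(f[j] for f in faces) / n for j in range(d)]
centred = [[f[j] - mean[j] for j in range(d)] for f in faces]

# d x d covariance matrix of the centred data
cov = [[sum(centred[i][a] * centred[i][b] for i in range(n)) / n
        for b in range(d)] for a in range(d)]

# Power iteration: repeatedly apply the covariance to a vector and
# renormalise; it converges to the leading eigenvector (the first eigenface)
v = [1.0] * d
for _ in range(100):
    w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

print([round(x, 3) for x in v])  # leading eigenface direction
```

In a real eigenfaces demo each row would be a flattened face image with thousands of pixels, and one would compute several leading eigenvectors, not just the first; projecting a face onto them gives a compact low-dimensional code, which is the link to the Dimensionality Reduction lecture.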

No single book covers the material for the course. Fairly detailed lecture notes will be provided. However, you may find it useful to refer to the texts below for different presentations of some of the material. I do not recommend that you buy these books.

Bishop: Pattern Recognition and Machine Learning. This is a comprehensive book that goes into more detail than this course and covers Bayesian methods much more thoroughly.

Machine Learning by Tom Mitchell. Here is a list of errata for the book obtained from Tom Mitchell's pages.

Another very useful book is "Neural Networks for Pattern Recognition" by Chris Bishop, Oxford University Press.

David MacKay's book, "Information Theory, Inference and Learning Algorithms", is fantastic and can be read online for free. You certainly do not need to read the whole book for this course! The chapters on neural networks are interesting, as is the basic material on probability. You can download chapters individually, or buy the book.

Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426, E-mail: school-office@inf.ed.ac.uk
Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh