This is a 10-point course for MSc-level students, running in semester 2. Fourth-year undergraduate students who feel they have sufficient background in mathematics should ask for permission to take the course.

Course catalogue entry

There has been a recent surge of computational architectures inspired by how the brain works, and, unlike previously, they now perform very well on certain tasks. At the same time, advanced computational analysis is increasingly used to analyse and understand neural data. This course covers both of these developments. After describing rigorous ways to characterise neural activity mathematically, and introducing methods for representing and modelling high-dimensional neural activity patterns, we present a number of architectures recently used for tasks such as image understanding, memory and cognition, as well as some brain-inspired hardware implementations.

The background needed to take this course successfully is a good grounding in mathematics, particularly probability and statistics, vectors and matrices. The mathematical level required is similar to that obtained by students who completed the courses Mathematics for Informatics 1-4, taken in the first two years of the Informatics undergraduate syllabus, without significant difficulty. The Neural Computation (NC) course is a helpful but not necessary prerequisite, as biological realism is a less central objective here than in NC. Machine learning courses (PMR, IAML, MLPR) will also be useful preparation, and complement the material covered here.

Instructor: Matthias Hennig

- Overview of relevant neurobiology
- Review of probabilistic modelling and information theory
- Neural encoding
- Neural decoding
- Information theory
- Statistical models of neuronal networks
- Neuro-inspired architectures
- Neural computation with attractors
- Discussion of class papers

There will be two assessed assignments worth 25% in total, and an exam worth 75%.

First assignment (2018, will be updated for the 2019 session). Use this Matlab/Octave code to generate spikes.
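The linked assignment code is Matlab/Octave. As a rough illustration of what generating spikes typically involves, here is a Python sketch of a homogeneous Poisson spike generator; the function name, rate and duration are illustrative choices, not taken from the assignment itself:

```python
import numpy as np

def poisson_spike_train(rate_hz, duration_s, dt=0.001, rng=None):
    """Generate a spike train from a homogeneous Poisson process.

    In each small bin of width dt, a spike occurs with probability
    rate_hz * dt (a valid approximation when rate_hz * dt << 1).
    Returns a boolean array with one entry per time bin.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_bins = int(round(duration_s / dt))
    return rng.random(n_bins) < rate_hz * dt

# Example: a 20 Hz train over 10 s; the expected spike count is
# rate * duration = 200, with Poisson variability around that value.
spikes = poisson_spike_train(20.0, 10.0, rng=np.random.default_rng(0))
print(spikes.sum())
```

The bin-wise Bernoulli approximation is the simplest scheme; an alternative is to draw inter-spike intervals directly from an exponential distribution, which avoids the small-`dt` requirement.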

Second assignment (2018, will be updated for the 2019 session)

Theoretical Neuroscience by P Dayan and L F Abbott (MIT Press 2001) is recommended reading; see also the list of errata.

Neuronal Dynamics by Wulfram Gerstner, Werner M. Kistler, Richard Naud and Liam Paninski. Full version online.

Natural Image Statistics by Aapo Hyvarinen, Jarmo Hurri, and Patrik O. Hoyer. Full version online.

Information Theory, Inference and Learning Algorithms by David MacKay

Introduction to the Theory of Neural Computation, Volume I, by John Hertz, Anders Krogh and Richard Palmer.

Neural decoding: lecture slides, lecture slides (single page)

Lecture 1: slides

Lecture 2: slides

Mean constraint derivation.

Hopfield model: slides
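The Hopfield model covered in these slides can be sketched in a few lines of Python; the following is a minimal illustrative example (the tiny 8-unit pattern and function names are my own choices) that stores a binary pattern with the standard Hebbian rule and recalls it from a corrupted cue via asynchronous updates:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum of outer products of the stored
    +/-1 patterns, with the self-connections (diagonal) zeroed."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, state, n_steps=100, rng=None):
    """Asynchronous dynamics: repeatedly pick a random unit and
    set it to the sign of its local field W[i] . s."""
    rng = np.random.default_rng() if rng is None else rng
    s = state.copy()
    for _ in range(n_steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store one pattern and recover it from a one-bit-corrupted cue.
p = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hopfield(p[None, :])
cue = p.copy()
cue[0] = -cue[0]  # flip one bit
print(recall(W, cue, rng=np.random.default_rng(1)))
```

With a single stored pattern, every update moves the state towards that pattern, so the flipped bit is corrected as soon as its unit is selected; with many stored patterns, spurious attractors and capacity limits appear, which is the subject of the slides.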

Deep models: lecture slides

This page is maintained by Matthias Hennig.
