There has been a recent surge of computational architectures inspired by how the brain works, and unlike earlier attempts, they actually perform well on certain tasks. At the same time, advanced computational analysis is increasingly used to analyse and understand neural data. This course covers both developments. After describing rigorous ways to characterise neural activity mathematically, and introducing methods by which high-dimensional neural activity patterns can be represented and modelled, we present a number of architectures recently used for tasks such as image understanding, memory and cognition, as well as some brain-inspired hardware implementations.

The background needed to take this course successfully is a good grounding in mathematics, particularly in probability and statistics, and in vectors and matrices. The mathematical level required is similar to that obtained by students who completed the courses Mathematics for Informatics 1-4, taken in the first two years of the Informatics undergraduate syllabus, without significant difficulty. The Neural Computation (NC) course is a helpful but not a necessary prerequisite, as biological realism is not as important an objective here as it is in NC. Machine learning courses (LfD, PMR, IAML) will also be useful preparation.

Instructors:
Mark van Rossum and Matthias Hennig.

Course Tutor: TBA

No lecture on Monday Jan 16th. The first lecture is on Thursday Jan 19th.

- Overview of relevant neurobiology
- Review of probabilistic modelling and information theory
- Neural encoding
- Neural decoding
- Information theory
- Statistical models of neuronal networks
- Neuro-inspired architectures
- Neural computation with attractors
- Discussion of class papers

There will be two assessed assignments worth 25% in total, and an exam worth 75%.

First assignment. Deadline March 27th, 2017.

Second assignment. Deadline April 4th, 2017.

Theoretical Neuroscience by P Dayan and L F Abbott (MIT Press 2001) is recommended reading; see also the list of errata.

Neuronal Dynamics by Wulfram Gerstner, Werner M. Kistler, Richard Naud and Liam Paninski. Full version online.

Natural Image Statistics by Aapo Hyvarinen, Jarmo Hurri, and Patrik O. Hoyer. Full version online.

Information Theory, Inference and Learning Algorithms by David MacKay.

Introduction to the Theory of Neural Computation, Volume I, by John Hertz.

Neural decoding: lecture slides, lecture slides (single page)

Networks for visual processing: lecture slides, lecture slides (single page)

Lecture 1: slides

Lecture 2: slides

Mean constraint derivation.

Hopfield model: slides
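The Hopfield model lecture covers attractor networks. As a rough illustration (a minimal sketch, not taken from the slides), a Hopfield network with Hebbian storage and asynchronous updates can be written as:

```python
import numpy as np

def train(patterns):
    """Hebbian storage: W = (1/N) * sum of outer products, zero diagonal."""
    n_units = patterns.shape[1]
    W = patterns.T @ patterns / n_units
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=20):
    """Asynchronous updates: sweep units in order until none changes sign."""
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:  # reached a fixed point (an attractor)
            break
    return s

# Store two random +/-1 patterns in a 50-unit network (well below the
# ~0.138 N capacity limit), corrupt one pattern, and let the net settle.
rng = np.random.default_rng(0)
N = 50
patterns = rng.choice([-1, 1], size=(2, N))
W = train(patterns)

noisy = patterns[0].copy()
noisy[rng.choice(N, size=5, replace=False)] *= -1  # flip 5 of 50 bits
recovered = recall(W, noisy)
print(int(np.sum(recovered == patterns[0])), "of", N, "bits correct")
```

At this low memory load the corrupted input lies well inside the basin of attraction of the stored pattern, so the sequential updates settle back onto it.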

Tutorial session: 16 March, 9am (usual place).

Deep models: lecture slides

This page is maintained by Mark van Rossum.

Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Tel: +44 131 651 5661, Fax: +44 131 651 1426, E-mail: school-office@inf.ed.ac.uk