Neural Information Processing: Course homepage 2015-2016

This is a course for MSc-level students. It runs in Semester 2.

There has been a recent surge of computational architectures inspired by how the brain works, and, unlike earlier attempts, they actually perform well on certain tasks. At the same time, advanced computational analysis is increasingly used to analyse and understand neural data. This course covers both of these developments. After introducing rigorous ways to describe neural activity mathematically, we present a number of the architectures recently used for tasks such as object recognition, memory, Bayesian sampling, and cognition, as well as some brain-inspired hardware ideas.

The background needed to take this course successfully is a good grounding in mathematics, particularly probability and statistics, and vectors and matrices. The mathematical level required is similar to that obtained by students who completed the courses Mathematics for Informatics 1-4, taken in the first two years of the Informatics undergraduate syllabus, without significant difficulty. The Neural Computation (NC) course is a helpful but not necessary prerequisite, as biological realism is less of an objective here than in NC. Machine learning courses (LfD, PMR, IAML) are also useful preparation.

Instructors: Mark van Rossum and Matthias Hennig.
Course Tutor: TBA


Lectures: Monday and Thursday, 9.00-9.50, LT2 Appleton Tower.
First lecture: Monday, January 11th, 2016.

Course Outline

Class papers

The list of class papers is available via this link. This list might be updated.


Assessment

There will be two assessed assignments, worth 25% in total, and an exam worth 75%.

First assignment. Deadline: March 16th, 2016.

Data file for first assignment

Second assignment. Deadline: April 5th, 2016.

Old assignments for practice: A1:12-13, A1:11-12, A1:10-11, A1:09-10, A1:08-09


Books

See also the references in the lecture notes.
Theoretical Neuroscience by P. Dayan and L. F. Abbott (MIT Press, 2001) is recommended reading; see also the list of errata.
Natural Image Statistics by Aapo Hyvarinen, Jarmo Hurri, and Patrik O. Hoyer. Full version online.
Neuronal Dynamics by Wulfram Gerstner, Werner M. Kistler, Richard Naud and Liam Paninski. Full version online.

Week-by-week listing (approximate)

Week 1

Lectures: Introduction: lecture slides (2x2), lecture slides (single page)
Neural encoding: lecture slides (2x2), lecture slides (single page)

Week 2

Lectures: Neural encoding (cont.)

Week 3

Lectures: Neural decoding: lecture slides, lecture slides (single page)

Week 4

Lectures: Information theory: lecture slides, lecture slides (single page)

Week 5

Lectures: Information theory (cont.)

Week 6

Innovative teaching week. No lectures.

Week 7

Lectures: Predicting Retinal Ganglion Cell Receptive Fields: slides (single page), slides (4up)
Higher Order Statistics: slides (single page), slides (4up)
Background reading on Fourier analysis: Fourier series, Fourier transform

Week 8

Lectures: Learning and networks: slides (single page), slides (4up)

Week 9

Lectures: Object recognition

Week 10

Class papers

Week 11

Week 12


This page is maintained by Mark van Rossum.
