Neural Computation 2017-2018

Neural Computation (NC) is a 10-point MSc course of 18 lectures in the first semester.
(Other students can attend after agreement.)
Lectures will be at 12.10 on Tuesdays and Fridays. We start on time!
For the location, see the official timetable.
The first lecture will be in week 1, on September 19th 2017.

Instructor: Mark van Rossum
No lectures on: Oct 3, Oct 31, Nov 17.
No practical on: TBC

Short description

In this course we study the computations carried out by the nervous system. Unlike most courses in artificial intelligence, we take a bottom-up approach: we incorporate data from neurobiology, simulate certain aspects of it, and try to formulate theories about the brain.
Apart from learning about the brain, you will also learn about numerical modelling of differential equations, non-linear dynamics, current neurobiological research, and pitfalls in modelling real-world systems.
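As a taste of the numerical modelling mentioned above, here is a minimal sketch of forward-Euler integration of a passive membrane equation. It is written in Python rather than the MatLab used in the practicals, and all parameter values are illustrative assumptions, not taken from the lecture notes:

```python
import numpy as np

# Forward-Euler integration of a passive (leaky) membrane:
#   tau * dV/dt = -(V - E_L) + R * I_ext
# Parameter values below are illustrative only.
tau = 10.0       # membrane time constant (ms)
E_L = -70.0      # resting potential (mV)
R = 10.0         # membrane resistance (MOhm)
I_ext = 1.5      # injected current (nA)
dt = 0.1         # time step (ms)
T = 100.0        # total simulated time (ms)

steps = int(T / dt)
V = np.empty(steps)
V[0] = E_L
for t in range(1, steps):
    dV = (-(V[t - 1] - E_L) + R * I_ext) / tau
    V[t] = V[t - 1] + dt * dV

# V relaxes exponentially towards E_L + R * I_ext = -55 mV
print(round(V[-1], 2))
```

The same update scheme underlies most of the simulations in the course; fancier integrators (e.g. Runge-Kutta) only change the accuracy per time step, not the idea.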

For whom is this course?
This course should appeal to students who are interested in the basic principles and 'biological hardware' implementations of computation in human and animal brains.

For whom is it not?
The topics discussed in the course have inspired many machine learning solutions to real-life problems; however, we shall hardly discuss those. It should also be noted that the course has little direct practical applicability outside academic research.

Keywords: single neuron models, small networks, neural codes, models of learning and synaptic plasticity.

There are extensive Lecture notes; during the lectures we go through them linearly. As for any course, preparatory reading of the notes before each lecture is recommended.
Slides for Learning and memory (up to slide 64 only)

Office hours: make an appointment or catch me after the lecture.


No prior biology/neuroscience knowledge is required. I use a small subset of not very advanced maths in the lectures. Keywords: linear differential equations, eigenvectors, Fourier transforms, fixed points. These older FMCS lecture notes can be used as a refresher. Alternatively, use Google to refresh forgotten maths if needed. If you are still stuck, use the practicals or office hours to resolve the problems.

In the tutorials we use MatLab and NEURON (a special-purpose neural simulator). No prior experience with either is required; however, MatLab skills are valuable for many courses.

More information on Matlab and how to make graphs and write reports.


The course will be assessed by one piece of coursework and an exam. There will also be a formative (non-assessed) coursework. Standard late policies will apply. Also see How to make graphs and write reports.

Assignment 1 (non-assessed). Deadline: Nov 3rd
Note on the answers to assignment 1

Assignment 2 BCM. Deadline: Dec 4th, 4pm
Note on the answers to assignment 2


Practicals are every week, starting week 2. Time and location: 9.00-9.50 in AT 4.12. You can use the practicals to work on the exercises below, and ask questions about the lectures. Attendance is not obligatory.

Timetable (approximate)

Week 1; week of Sep 18
Tuesday lecture: 1. Introduction and Chapter 1: Anatomy 
Friday lecture: 2. Chapter 2: Passive properties.
No practical.

Week 2; week of Sep 25
Tuesday lecture: 3. Chapter 3: Hodgkin-Huxley
Friday lecture: 4. Chapter 3: Hodgkin-Huxley
Practical: 1. The NEURON simulator: Passive properties

Week 3; week of Oct 3
NO Tuesday lecture:
Friday lecture: 6. Chapter 4: Synapses
Practical: 2. The NEURON simulator: Hodgkin-Huxley model

Week 4; week of Oct 9
Tuesday lecture: 7. Chapter 5: Integrate and Fire models
NO Friday lecture:
Practical: 4. Matlab: AMPA receptor simulation. Script: ampa.m

Week 5; week of Oct 16
Tuesday lecture: 8. Chapter 6: Firing statistics
Friday lecture: 9. Chapter 7: Retina and LGN
Practical: 5. Matlab: An integrate-and-fire neuron. Script: mvr_if_matlab.m
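The integrate-and-fire model of Chapter 5 is simple enough to sketch in a few lines. Below is a minimal Python version of a leaky integrate-and-fire neuron; the practical's mvr_if_matlab.m is in MatLab and may use different parameters, so everything here is an illustrative assumption:

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron, Euler integration.
# Sub-threshold: tau * dV/dt = -(V - E_L) + R * I_ext
# On crossing V_th, a spike is recorded and V is reset.
tau, E_L, R = 10.0, -70.0, 10.0    # ms, mV, MOhm (illustrative values)
V_th, V_reset = -54.0, -70.0       # threshold and reset (mV)
I_ext = 2.0                        # constant input current (nA)
dt, T = 0.1, 200.0                 # time step and duration (ms)

V = E_L
spike_times = []
for step in range(int(T / dt)):
    V += dt * (-(V - E_L) + R * I_ext) / tau
    if V >= V_th:                  # threshold crossing: spike and reset
        spike_times.append(step * dt)
        V = V_reset

print(len(spike_times), "spikes in", T, "ms")
```

With these numbers the steady-state voltage E_L + R*I_ext = -50 mV lies above threshold, so the neuron fires regularly; lowering I_ext below 1.6 nA would silence it.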

Week 6; week of Oct 23
Tuesday lecture: 11 Chapter 8: Higher visual processing
Friday lecture: 12. Chapter 9: Coding
Practical: Work on the assignment and/or questions 7 and 8 of 6. Simple and complex cells (accompanying Dayan and Abbott chapter: encode2.pdf)

Week 7; week of Oct 30
NO Tuesday lecture:
Friday lecture: 14. Chapter 11: Spiking networks
Practical: Work on the assignment and/or questions 7 and 8 of 6. Simple and complex cells (accompanying Dayan and Abbott chapter: encode2.pdf)

Week 8; week of Nov 6
Tuesday lecture: 15. Chapter 13: Hebbian Learning
Friday lecture: 15. Chapter 13: Hebbian Learning
Practical: Information in spike trains.

Week 9; week of Nov 13
Tuesday lecture: 16. Chapter 13: Spike timing dep. Hebbian Learning
NO Friday lecture:
Practical: No practical

Week 10; week of Nov 20
Tuesday lecture: 17. Learning and memory
Friday lecture: 17. Learning and memory
Slides for Learning and memory
Practical: 7. Matlab: Ben-Yishai network. Script: ben2.m, or work on the assignment.

Week 11; week of Nov 27
Tuesday lecture: 17. Learning and memory
Friday lecture: Spill over.
Practical: 8. Matlab: Hebbian learning with constraints. Script (will appear later): hebb.m, or work on the assignment.
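As an illustration of Hebbian learning with constraints, here is a Python sketch of Oja's rule, one standard constrained Hebbian rule in which weight growth is checked by an implicit normalization term. This is an assumption for illustration; the course script hebb.m may implement a different constraint scheme (e.g. explicit subtractive or multiplicative normalization):

```python
import numpy as np

# Oja's rule: dw = eta * y * (x - y * w).
# The -eta * y^2 * w decay term keeps |w| bounded (it converges to 1),
# and w aligns with the leading eigenvector of the input correlations.
rng = np.random.default_rng(0)

# Correlated 2-d inputs with a dominant direction.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])                 # input correlation matrix
L = np.linalg.cholesky(C)
X = rng.standard_normal((5000, 2)) @ L.T   # samples with covariance C

w = rng.standard_normal(2) * 0.1           # small random initial weights
eta = 0.01                                 # learning rate
for x in X:
    y = w @ x                              # postsynaptic activity
    w += eta * y * (x - y * w)             # Hebb term minus decay

# Compare against the leading eigenvector (first principal component) of C.
eigvals, eigvecs = np.linalg.eigh(C)
pc1 = eigvecs[:, -1]
print(np.round(w, 3), np.round(pc1, 3))
```

The weight vector ends up parallel (up to sign) to the first principal component of the inputs, which is the standard result connecting constrained Hebbian learning to PCA.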

Additional material (discussed in the lectures):

Movies of LGN and V1 recordings (play with mplayer under Linux):


Recurrent 6-node network with chaotic behaviour: bifur6.m


Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB, Scotland, UK
Unless explicitly stated otherwise, all material is copyright © The University of Edinburgh