Neural Computation 2015-2016
Neural Computation (NC) is a 10-point MSc course of 18 lectures in
the first semester.
(Other students can attend after agreement).
NOTE CHANGE OF LOCATION: Tuesday: G.02, 16-20 George Square
We start on time!
The first lecture will be in week 1
on September 22nd 2015.
In this course we study the computations carried out by the nervous
system. Unlike most courses in artificial intelligence, we take
a bottom-up approach: we incorporate data from
neurobiology, simulate certain aspects of it, and try to formulate
theories about the brain.
Apart from learning about the brain, you will also learn about numerical modelling
of differential equations, non-linear dynamics, current neurobiological research, and
pitfalls in modelling real-world systems.
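As a taste of the numerical modelling involved, a linear membrane-style equation such as dV/dt = -(V - V_rest)/tau can be integrated with the forward Euler method. A minimal sketch in Python (the practicals themselves use MATLAB and NEURON; all parameter values here are illustrative assumptions):

```python
# Forward Euler integration of dV/dt = -(V - V_rest)/tau,
# a simple passive-membrane-style linear ODE.
# Parameter values are illustrative, not taken from the course notes.
V_rest = -70.0   # resting potential (mV)
tau = 10.0       # membrane time constant (ms)
dt = 0.1         # time step (ms)

V = 0.0          # start away from rest
for _ in range(int(100.0 / dt)):        # simulate 100 ms
    V += dt * (-(V - V_rest) / tau)

# After ~10 time constants, V has decayed essentially back to V_rest.
print(round(V, 3))
```

The same update pattern generalises to the non-linear models later in the course; only the right-hand side changes.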
For whom is this course?
This course will appeal to students who are interested in the basic
principles and 'biological hardware' of computation
in human and animal brains.
For whom is it not?
The topics discussed in the course have inspired software solutions
to real-life problems; however, we shall hardly
discuss those. The course has little practical applicability outside
research, concentrating instead on neuron models, small networks,
neural codes, and models of learning.
Lecture notes: part 1 and part 2.
Office hours: make an appointment or catch me after the lecture.
No prior biology/neuroscience knowledge is required. I use a small
subset of not very advanced math in the lectures. Keywords: linear differential equations, eigenvectors, Fourier transformations, fixed points.
FMCS lecture notes
can be used as a refresher.
Alternatively, use Google to refresh forgotten maths if needed. If
you are still stuck, use the practicals or office hours to resolve
any remaining problems.
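As a small refresher of how the keywords above fit together: for a linear system dx/dt = Ax, the fixed point x = 0 is stable exactly when all eigenvalues of A have negative real part. A minimal illustration in Python with NumPy (an assumed stand-in for the MATLAB used in the practicals):

```python
import numpy as np

# Linear system dx/dt = A x; the fixed point x = 0 is stable
# iff all eigenvalues of A have negative real part.
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])

eigvals = np.linalg.eigvals(A)          # here: -1 and -3 (matrix is triangular)
stable = bool(np.all(eigvals.real < 0))
print(eigvals, stable)
```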
In the tutorials we use MATLAB and NEURON (a special-purpose
simulator). No prior experience with either is required; however,
MATLAB skills are valuable for many courses.
More information on MATLAB and on
how to make graphs and write reports.
The course will be fully assessed by two reports on the practical
assignments, which will appear here. The two marks are averaged.
Standard late policies apply. See also How to make graphs and write reports.
Assignment 1 Deadline: 30 Oct 4pm
Note to answers assignment 1
AMPA mod file
NMDA mod file
Assignment 2 Deadline: 4 Dec 4pm
Note to answers assignment 2
Preferably hand in a hardcopy at the ITO; otherwise email to
Practicals are every week, starting in week XX. Time and location TBA.
No practicals in the first weeks.
You can use the practicals to work on the
exercises below, and ask questions about the lectures. Attendance is
Week 1; week of Sep 21
Tuesday lecture: 1. Introduction and Chapter 1: Anatomy
Friday lecture: 2. Chapter 2: Passive properties.
Week 2; week of Sep 28
Tuesday lecture: 3. Chapter 3: Hodgkin-Huxley
Friday lecture: 4. Chapter 3: Hodgkin-Huxley
Practical: 1. The NEURON simulator: Passive
Week 3; week of Oct 5
Tuesday lecture: 5. Chapter 4: Synapses
Friday lecture: 6. Chapter 4: Synapses
Practical: 2. The NEURON simulator:
Week 4; week of Oct 12
Tuesday lecture: 7. Chapter 5: Integrate and Fire models
Friday lecture: 8. Chapter 6: Firing statistics
Practical: 4. Matlab: AMPA receptor simulation.
Week 5; week of Oct 19
Tuesday lecture: 9. Chapter 7: Retina and LGN
NO Friday lecture.
Practical: 5. Matlab: An Integrate and fire
neuron Script: mvr_if_matlab.m
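The practical's integrate-and-fire script (mvr_if_matlab.m) is in MATLAB; the standard leaky integrate-and-fire scheme it exercises can be sketched in Python as follows (all parameter values are illustrative assumptions, not taken from the script):

```python
# Leaky integrate-and-fire neuron driven by constant current:
#   tau * dV/dt = -(V - V_rest) + R * I
# V is reset to V_reset whenever it crosses V_thresh.
# Illustrative parameters, not those of mvr_if_matlab.m.
tau, R = 10.0, 10.0                               # ms, MOhm
V_rest, V_reset, V_thresh = -70.0, -70.0, -54.0   # mV
I = 2.0                                           # nA -> drive of R*I = 20 mV
dt, T = 0.1, 500.0                                # ms

V = V_rest
spike_times = []
for step in range(int(T / dt)):
    V += dt / tau * (-(V - V_rest) + R * I)       # forward Euler step
    if V >= V_thresh:
        spike_times.append(step * dt)             # record spike time (ms)
        V = V_reset                               # reset after the spike

print(len(spike_times))                           # spike count in 500 ms
```

Because the steady-state voltage V_rest + R*I = -50 mV lies above threshold, the neuron fires regularly; lowering I below 1.6 nA would silence it.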
Week 6; week of Oct 26
Tuesday lecture: 11. Chapter 8: Higher visual processing
Friday lecture: 12. Chapter 9: Coding
Week 7; week of Nov 2
Tuesday lecture: 13. Chapter 10: Networks
Friday lecture: 14. Chapter 11: Spiking networks
Practical: Questions 7 and 8 of 6. Simple
and complex cells (accompanying Dayan and Abbott chapter: encode2.pdf)
Week 8; week of Nov 9
NO Tuesday lecture.
Friday lecture: 15. Chapter 13: Hebbian Learning
Practical: 7. Matlab: Ben-Yishai network
Week 9; week of Nov 16
Tuesday lecture: 16. Chapter 13: Spike timing dep. Hebbian Learning
Friday lecture: 17. Chapter 12: Decisions
Practical: NO Practical
Week 10; week of Nov 23
Tuesday lecture: 18. Chapter 13: Spike timing dep. Hebbian Learning (continued)
Friday lecture: Spill over.
Practical: 8. Matlab: Hebbian learning
with constraints Script (will appear later): hebb.m
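Hebbian learning with a constraint can be illustrated with Oja's rule, where a decay term keeps the weight vector normalised and the weights converge towards the leading principal component of the input correlations. A sketch in Python (the practical script hebb.m is in MATLAB and its details may differ; parameters here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 2-D inputs: the leading principal component lies along (1, 1).
C = np.array([[1.0, 0.8],
              [0.8, 1.0]])
L = np.linalg.cholesky(C)       # used to draw samples with covariance C

w = rng.normal(size=2)          # random initial weights
eta = 0.01                      # learning rate

for _ in range(20000):
    x = L @ rng.normal(size=2)  # input sample with covariance C
    y = w @ x                   # linear neuron output
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term minus normalising decay

# w converges (up to sign) towards the unit leading eigenvector ~ (1,1)/sqrt(2)
print(np.round(w / np.linalg.norm(w), 2))
```

Without the -y*w decay term, plain Hebbian learning w += eta*y*x would grow the weights without bound, which is exactly why a constraint is needed.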
Additional material (discussed in the lectures):
Movies of LGN and V1 recordings (play with mplayer under Linux):
Recurrent 6-node network with chaotic behavior bifur6.m
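The kind of dynamics bifur6.m exhibits can be sketched with a generic continuous-time recurrent rate network; with a random weight matrix whose gain is scaled well above one, such networks can show chaotic activity. This Python sketch is not the course's bifur6.m, just an illustrative stand-in with assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic continuous-time recurrent rate network, Euler-integrated:
#   tau * dx/dt = -x + W @ tanh(x)
# Illustrative stand-in for a 6-node network; NOT the course's bifur6.m.
N, g = 6, 2.5                    # 6 nodes; gain g > 1 permits rich dynamics
W = g * rng.normal(size=(N, N)) / np.sqrt(N)   # random recurrent weights
tau, dt = 1.0, 0.01

x = rng.normal(size=N)           # random initial state
traj = []
for _ in range(5000):
    x += dt / tau * (-x + W @ np.tanh(x))
    traj.append(x.copy())

traj = np.array(traj)
print(traj.shape)                # activity of all 6 nodes over 5000 steps
```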