Neural Computation 2014-2015

Neural Computation (NC) is a 10 point MSc course of 18 lectures in the first semester.
(Other students can attend after agreement).
Lectures are on Tuesday and Friday, 12.10-13.00, in SR2.04 AT. We start on time!
The first lecture will be in week 1 on September 16th 2014.
No lectures on Nov 14th and 18th. No tutor session on the 15th.
Instructor: Mark van Rossum

Short description

In this course we study the computations carried out by the nervous system. Unlike most courses in artificial intelligence, we take a bottom-up approach: we incorporate data from neurobiology, simulate certain aspects of it, and try to formulate theories about the brain.
Apart from learning about the brain, you will also learn about numerical modelling of differential equations, non-linear dynamics, current neurobiological research, and the pitfalls of modelling real-world systems.
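
To give a flavour of the numerical modelling involved, the sketch below (an illustration of mine, not course material) integrates the passive membrane equation C dV/dt = -(V - Eleak)/R + I with the forward Euler method in Matlab; all parameter values are made up.

  % Forward-Euler integration of a passive (leaky) membrane patch.
  C     = 100e-12;       % membrane capacitance (F)
  R     = 100e6;         % membrane resistance (Ohm)
  Eleak = -70e-3;        % leak reversal potential (V)
  I     = 0.2e-9;        % injected current (A)
  dt    = 0.1e-3;        % time step (s)
  t     = 0:dt:0.2;      % 200 ms of simulated time
  V     = zeros(size(t));
  V(1)  = Eleak;         % start at rest
  for k = 1:length(t)-1
      dVdt   = (-(V(k) - Eleak)/R + I)/C;   % membrane equation
      V(k+1) = V(k) + dt*dVdt;              % Euler step
  end
  plot(t*1e3, V*1e3); xlabel('time (ms)'); ylabel('V (mV)');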

For whom is this course?
This course will appeal to students who are interested in the basic principles and 'biological hardware' implementations of computation in human and animal brains.

For whom is it not?
The topics discussed in the course have inspired software solutions to real-life problems; however, we shall hardly discuss those. The course has little practical applicability outside academic research.

Keywords: single neuron models, small networks, neural codes, models of learning and synaptic plasticity.

Lecture notes: part 1 and part 2

Office hours: make an appointment or catch me after the lecture.

Prerequisites

No prior biology/neuroscience knowledge is required. The lectures use only a small amount of not very advanced maths. Keywords: linear differential equations, eigenvectors, Fourier transforms, fixed points. These older FMCS lecture notes can be used as a refresher; alternatively, use Google to refresh forgotten maths. If you are still stuck, use the practicals or office hours to resolve the problems.
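
As an indication of the level (an example of mine, not taken from the notes): you should be able to recognise that the linear differential equation dV/dt = -(V - V0)/tau has the fixed point V = V0, that its eigenvalue is -1/tau, and that its solution V(t) = V0 + (V(0) - V0) exp(-t/tau) decays towards the fixed point with time constant tau.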

In the tutorials we use Matlab and NEURON (a special-purpose simulator). No prior experience with either is required; Matlab skills, however, are valuable for many courses.

More information on Matlab and how to make graphs and write reports.
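
If you have never seen Matlab, the sketch below (illustrative only, not part of those guidelines) shows roughly what producing a labelled figure for a report looks like.

  % Plot a sine wave with labelled axes and export the figure for a report.
  x = linspace(0, 2*pi, 200);
  y = sin(x);
  plot(x, y, 'LineWidth', 2);
  xlabel('x'); ylabel('sin(x)'); title('Example figure');
  print('-depsc', 'example_figure.eps');   % the filename is arbitrary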

Assessment

The course is assessed entirely by two reports on practical assignments, which will appear here. The two marks are averaged. Standard late policies apply. Also see How to make graphs and write reports.

Assignment 1 Deadline: 24 Oct 4pm
Note to answers assignment 1

Assignment 2 Deadline: 28 Nov
Note to answers assignment 2

Preferably hand in a hardcopy at the ITO; otherwise email it to mvanross@inf

Practicals

Practicals are held every week from week 3 onwards, noon-1pm, in AT 5.08. There are no practicals in the first two weeks. Your lecturer is your tutor. You can use the practicals to work on the exercises below and to ask questions about the lectures. Attendance is not obligatory.

Timetable (approximate)

Week 1; week of Sep 15
Tuesday lecture: 1. Introduction and Chapter 1: Anatomy 
Friday lecture: 2. Chapter 2: Passive properties.
No practical.

Week 2; week of Sep 22
Tuesday lecture: 3. Chapter 3: Hodgkin-Huxley
Friday lecture: 4. Chapter 3: Hodgkin-Huxley
Practical: 1. The NEURON simulator: Passive properties

Week 3; week of Sep 29
Tuesday lecture: 5. Chapter 4: Synapses
Friday lecture: 6. Chapter 4: Synapses
Practical: 2. The NEURON simulator: Hodgkin-Huxley model

Week 4; week of Oct 6
Tuesday lecture: 7. Chapter 5: Integrate and Fire
Friday lecture: 8. Chapter 6: Firing statistics
Practical: 4. Matlab: AMPA receptor simulation. Script: ampa.m

Week 5; week of Oct 13
Tuesday lecture: 9. Chapter 7: Retina and V1
Friday lecture: 10. Chapter 7: Retina and V1
Practical: 5. Matlab: An integrate-and-fire neuron. Script: mvr_if_matlab.m
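
For orientation, this practical boils down to a loop of the shape sketched below (a rough illustration of mine, not the actual script mvr_if_matlab.m; all parameter values are made up).

  % Rough sketch of a leaky integrate-and-fire neuron driven by constant current.
  tau    = 20e-3;     % membrane time constant (s)
  R      = 100e6;     % membrane resistance (Ohm)
  Vrest  = -70e-3;    % resting potential (V)
  Vth    = -50e-3;    % spike threshold (V)
  Vreset = -70e-3;    % reset potential (V)
  I      = 0.25e-9;   % injected current (A)
  dt     = 0.1e-3;    % time step (s)
  t      = 0:dt:0.5;
  V      = Vrest*ones(size(t));
  spikes = [];
  for k = 1:length(t)-1
      V(k+1) = V(k) + dt*(-(V(k) - Vrest) + R*I)/tau;   % Euler step
      if V(k+1) >= Vth                                  % threshold crossed
          spikes(end+1) = t(k+1);                       % record the spike time
          V(k+1) = Vreset;                              % reset the membrane
      end
  end
  plot(t*1e3, V*1e3); xlabel('time (ms)'); ylabel('V (mV)');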

Week 6; week of Oct 20
Tuesday lecture: 11. Chapter 8: Coding
Friday lecture: 12. Chapter 9: Higher visual processing
Practical: Questions 7 and 8 of 6. Simple and complex cells. Dayan and Abbott chapter: encode2.pdf

Week 7; week of Oct 27
Tuesday lecture: 13. Chapter 10: Networks
Friday lecture: 14. Chapter 11+12: Decisions
Practical: 6. Matlab: Ben-Yishai network. Script: ben2.m

Week 8; week of Nov 3
Tuesday lecture: 15. Chapter 13: Hebbian Learning
Friday lecture: 16. Chapter 13: Hebbian Learning
Practical: 8. Matlab: Hebbian learning with constraints. Script (will appear later): hebb.m
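
As a preview of this practical (a sketch of mine, not the course script hebb.m): plain Hebbian learning, dw proportional to y*x, lets the weights grow without bound, so some constraint is needed. One standard choice is Oja's rule, which keeps the weight vector normalised and drives it towards the principal eigenvector of the input covariance.

  % Hebbian learning stabilised by Oja's rule on toy 2-D input data.
  rng(1);                               % reproducible toy data
  C   = [3 1; 1 1];                     % desired input covariance
  X   = randn(5000, 2) * chol(C);       % zero-mean correlated inputs
  w   = randn(2, 1); w = w/norm(w);     % random initial weight vector
  eta = 0.005;                          % learning rate
  for n = 1:size(X, 1)
      x = X(n, :)';                     % input vector
      y = w' * x;                       % linear neuron output
      w = w + eta*(y*x - y^2*w);        % Oja's rule: Hebbian term minus decay
  end
  disp(w');                             % close to the top eigenvector of C (up to sign)
  [V, D] = eig(C); disp(V(:, end)');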

Week 9; week of Nov 10
Tuesday lecture: 17. Chapter 13: Spike-timing-dependent Hebbian Learning
Friday lecture: NO LECTURE
Practical: Spill over.

Week 10; week of Nov 17
Tuesday lecture: NO LECTURE
Friday lecture: Spill over.
Practical: NO practical.

Additional material (discussed in the lectures):

Movies of LGN and V1 recordings (play with mplayer under Linux):

hubel_Wiesel_lgn_off_cell.asf
hubel_wiesel_binocular_cell.asf
hubel_wiesel_complex.asf
hubel_wiesel_directional_cell.asf
hubel_wiesel_lgn_on_cell.asf
hubel_wiesel_simple_cell.asf

Recurrent 6-node network with chaotic behaviour: bifur6.m

