Neural Computation 2019-2020

Neural Computation (NC) is a 10-point course of 18 lectures in the first semester. It is suitable for 4th-year undergraduate students and MSc students.

If you are interested but unsure if you can attend, please contact the course lecturers.

Lectures will be at 12.10 on Tuesday and Friday. The Tuesday session is in G.3 Bayes, and the Friday session is in David Hume Tower, room LG.11.

The first lecture will be in week 1 on September 17th 2019.

Instructors: Peggy Series and Matthias Hennig

Short description

In this course we study the computations carried out by the nervous system. Unlike most courses in artificial intelligence, we take a bottom-up approach. This means that we incorporate data from neurobiology, simulate certain aspects of it, and try to formulate theories about the brain.
Apart from learning about the brain, you will also learn about numerical modelling of differential equations, non-linear dynamics, current neurobiological research and pitfalls in modelling real-world systems.
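As a taste of what 'numerical modelling of differential equations' means here, below is a minimal Matlab/Octave sketch (our own illustration, not part of the course materials) that integrates the passive membrane equation tau_m dV/dt = -(V - E_L) + R_m I_e with the forward Euler method; all parameter values are arbitrary.

% Forward Euler integration of the passive membrane equation (illustrative only).
tau_m = 10;     % membrane time constant (ms)
E_L   = -70;    % resting (leak) potential (mV)
R_m   = 10;     % membrane resistance (MOhm)
I_e   = 1.5;    % injected current (nA)
dt    = 0.1;    % integration time step (ms)
T     = 100;    % total simulated time (ms)
t = 0:dt:T;
V = zeros(size(t));
V(1) = E_L;     % start at rest
for k = 1:length(t)-1
    dVdt   = (-(V(k) - E_L) + R_m * I_e) / tau_m;   % membrane equation
    V(k+1) = V(k) + dt * dVdt;                      % Euler step
end
plot(t, V); xlabel('time (ms)'); ylabel('V (mV)');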

Who is this course for?
This course should appeal to students who are interested in the basic principles and the 'biological hardware' implementations of computation in human and animal brains. It complements other courses in the cognitive sciences by offering a more biological perspective. The CCN course is an excellent follow-up, exploring higher-level concepts of perception and cognition.
Who is it not for?
The topics discussed in the course have inspired many machine learning solutions to real-life problems; however, we will hardly touch on those. These links are explored in more detail in the NIP course (note: not running in 2019/20), for which this course is good preparation. It should also be noted that the course has limited direct practical applicability outside academic research.

Keywords: single neuron models, small networks, neural codes, models of learning and synaptic plasticity.

There are extensive Lecture notes, and during the lectures we work through them linearly. As with any course, preparatory reading of the notes before each lecture is recommended. Note that the notes contain extra material for context; only material covered in class will be examined.

Office hours: make an appointment via email or catch us after the lecture.

Prerequisites

No prior biology/neuroscience knowledge is required. We use only a small amount of not-very-advanced mathematics in the lectures. Keywords: linear differential equations, eigenvectors, Fourier transforms, fixed points. These older FMCS lecture notes can be used as a refresher. Alternatively, use Google to refresh forgotten maths if needed. If you are still stuck, use the practicals or office hours to resolve any problems.

In the tutorials we use Matlab/Octave and NEURON (a special-purpose simulator). No prior experience with either is required; however, Matlab skills are valuable for many courses.

More information on Matlab and how to make graphs and write reports.

Assessment

The course will be assessed by one piece of coursework and an exam. There will also be a formative (non-assessed) piece of coursework. Standard late policies will apply. Also see How to make graphs and write reports.
Assignment 1 (non-assessed). Deadline: 4pm Nov 1 2019 - Required file: noisy_iclamp.mod
Assignment 2 (assessed). Deadline: 4pm Nov 22 2019

Lab sessions

Practical lab sessions run every week from week 3 onwards. Time and location: 14.10-15.00 in AT 6.06. You can use the practicals to work on the exercises below and to ask questions about the lectures. Attendance is not compulsory, but highly recommended.

Timetable (approximate; see the Learn page for up-to-date info)

Week 1
Tuesday lecture: 1. Introduction and Chapter 1: Anatomy.
Friday lecture: 2. Chapter 2: Passive properties.
No practical.

Week 2
Tuesday lecture: 3. Chapter 3: Hodgkin-Huxley
Friday lecture: 4. Chapter 3: Hodgkin-Huxley
Practical: 1. The NEURON simulator: Passive properties

Week 3

6. Chapter 4: Synapses
Practical: 2. The NEURON simulator: Hodgkin-Huxley model

Week 4
7. Chapter 5: Integrate and Fire models
Practical: 3. Matlab: AMPA receptor simulation. Script: ampa.m

Week 5
Tuesday lecture: 8. Chapter 6: Firing statistics
Friday lecture: 9. Chapter 7: Retina and LGN
Practical: 4. Matlab: An integrate-and-fire neuron. Script: NClab4.m (an illustrative sketch follows below).
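For orientation, here is a minimal leaky integrate-and-fire sketch in Matlab/Octave. It is our own illustration, not the NClab4.m script, and all parameter values are arbitrary choices.

% Leaky integrate-and-fire neuron with threshold and reset (illustrative only).
tau_m = 10; E_L = -70; R_m = 10; I_e = 2.0;   % ms, mV, MOhm, nA (arbitrary values)
V_th = -54; V_reset = -80;                    % spike threshold and reset potential (mV)
dt = 0.1; T = 200; t = 0:dt:T;
V = E_L * ones(size(t)); spikes = [];
for k = 1:length(t)-1
    % Subthreshold dynamics: the same passive membrane equation as in the short description
    V(k+1) = V(k) + dt/tau_m * (-(V(k) - E_L) + R_m * I_e);
    if V(k+1) >= V_th                         % threshold crossed: record spike, then reset
        spikes(end+1) = t(k+1);
        V(k+1) = V_reset;
    end
end
plot(t, V); xlabel('time (ms)'); ylabel('V (mV)');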

Week 6
Tuesday lecture: 11. Chapter 8: Higher visual processing
Friday lecture: 12. Chapter 9: Coding
Practical: 5. Simple and complex cells in V1 (based on questions 7 and 8 of Dayan and Abbott; accompanying Dayan and Abbott chapter: encode2.pdf)

Week 7

14. Chapter 11: Spiking networks
Practical: 6. Simple and complex cells in V1 (based on questions 7 and 8 of Dayan and Abbott, plus temporal response; accompanying Dayan and Abbott chapter: encode2.pdf)

Week 8
Tuesday lecture: 15. Chapter 13: Hebbian Learning
Friday lecture: 15. Chapter 13: Hebbian Learning
Practical: 7. Information in spike trains.

Week 9
Tuesday lecture: 16. Chapter 13: Spike-timing-dependent Hebbian Learning
Friday lecture: tbd
Practical: 8. Matlab: Ben-Yishai network.

Week 10
Tuesday lecture: 17. Learning and memory
Friday lecture: 17. Learning and memory

Practical: 9. Matlab: Hebbian learning.

Week 11
Tuesday: no lecture
Friday lecture: Guest lecture: Barbara Webb on Modelling the neural basis of insect navigation.
Lab: tbd

Additional material (discussed in the lectures)

Movies of LGN and V1 recordings (play with mplayer under Linux):

hubel_Wiesel_lgn_off_cell.asf
hubel_wiesel_binocular_cell.asf
hubel_wiesel_complex.asf
hubel_wiesel_directional_cell.asf
hubel_wiesel_lgn_on_cell.asf
hubel_wiesel_simple_cell.asf

Recurrent 6-node network with chaotic behavior: bifur6.m

