Lecture 4, Friday w2, 2014-09-26

This lecture was fast. I expect most people have to go over the Source Coding Theorem themselves more slowly—more than once—before they get it. I know I did. The next few lectures are slower. And there’s only one lecture next week. Make some time to work through the material so far.

Main lecture points:

Check your progress

Things you should be able to do, and questions to think about after reviewing the lecture:

We have finished the ‘week 2’ slides, except for a detail I skipped in the binary entropy function and the numerical note on log(sum(exp(·))). I will return to the things I skipped. Ask on NB if anything else is unclear, or too compressed.
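If you want the numerical point before I get back to it: evaluating log(sum(exp(·))) naively underflows or overflows when the arguments are large in magnitude, and the standard fix is to shift by the maximum before exponentiating. A minimal sketch of that trick (my own Python/numpy code, not taken from the slides):

    import numpy as np

    def logsumexp(a):
        # Stable log(sum(exp(a))): subtracting max(a) makes the largest
        # exponent exp(0) = 1, so the sum cannot overflow; the shift is
        # added back, since log(sum(exp(a))) = m + log(sum(exp(a - m))).
        a = np.asarray(a, dtype=float)
        m = np.max(a)
        return m + np.log(np.sum(np.exp(a - m)))

    # Example: exp(-1000) underflows to 0 in double precision, so the
    # naive expression gives log(0) = -inf, while the shifted version
    # returns roughly -999.69.
    print(logsumexp([-1000.0, -1001.0]))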

MacKay's Chapter 4 gives more detail on the Source Coding Theorem, so I recommend reading the rest of that chapter. MacKay's notation is slightly different from mine. For example, I indexed the typical set with m, the number of standard deviations away from the mean; MacKay uses β, which absorbs the standard deviation of h(x). Also, I didn't use the symbols Hδ or Sδ. IT exam questions won't require you to recall the details of the notation in my proof or MacKay's. However, you do need to understand the ideas behind the proof: you may have to explain parts of the argument, or answer questions relating to it.
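To make the notation mapping concrete, here is a rough sketch (in LaTeX notation) of how the two indexings line up; check it against the slides and MacKay's Section 4.4 rather than trusting my summary. For a string $\mathbf{x} = (x_1,\dots,x_N)$ of i.i.d. symbols, the total information content $\log_2 \frac{1}{P(\mathbf{x})} = \sum_{n=1}^N h(x_n)$ has mean $NH$ and standard deviation $\sqrt{N}\,\sigma$, where $\sigma^2 = \mathrm{var}[h(x_n)]$. The lecture's typical set, indexed by m standard deviations, is

    T_m = \left\{ \mathbf{x} : \left| \log_2 \tfrac{1}{P(\mathbf{x})} - NH \right| \le m\sqrt{N}\,\sigma \right\},

while MacKay's, indexed by a tolerance β, is

    T_{N\beta} = \left\{ \mathbf{x} : \left| \tfrac{1}{N}\log_2 \tfrac{1}{P(\mathbf{x})} - H \right| < \beta \right\},

so the two match when $\beta = m\sigma/\sqrt{N}$, and Chebyshev's inequality gives $P(\mathbf{x} \notin T_m) \le 1/m^2$. Either way, the point of the proof is the same: for large N, almost all of the probability mass sits in a set of roughly $2^{NH}$ strings.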


This page maintained by Iain Murray.
Last updated: 2014/09/26 14:43:39

