Lecture 9, Friday w5, 2014-10-17
================================

Things we covered:

* Reflected on arithmetic coding.
* Encoder and decoder ask the probabilistic model for the same sequence of
  probability distributions, but don't emit and consume bytes at the same
  times (a toy sketch appears at the end of this log).
* Arithmetic coding is easily adapted to use output alphabets other than
  {0,1}, and to use symbols with unequal probabilities.
* Bayes' rule is for inferring the explanation behind data.
* Predictions: consider the predictions you'd make if you knew everything,
  then weight the predictions under different assumptions by their
  plausibility given the data (from Bayes' rule). This recipe is written
  out as an equation at the end of this log.
* The same sequence of maths works for the card puzzle, for predicting in
  the sparse file model, and for any prediction problem.
* The Beta distribution is a standard distribution on real numbers in [0,1].
  It has two parameters, $\alpha$ and $\beta$; once these are fixed we know
  the whole distribution, including its mean (stated at the end of this log).

Applying this lecture
---------------------

Try to be formal when answering inference and prediction questions in the
tutorial exercises. Even if things seem easy, try to write out the full
general probabilistic reasoning as in class.

Recommended reading
-------------------

We're now halfway through the `week 5' slides. Mark anything that's unclear,
or that needs expanding, on NB.

As pointed out in the last lecture log, arithmetic coding is covered in
MacKay pp110--116. Inference and prediction with an unknown Bernoulli
distribution is in section 3.2, pp51--52 of MacKay. You could always read
more of Chapter 3 if keen.
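
Extra sketches
--------------

To make the encoder/decoder symmetry concrete, here is a minimal sketch (not
code from the slides): both sides query the same adaptive model, so they see
the same sequence of probability distributions, even though only the encoder
ever touches the message. It uses exact `Fraction` arithmetic to sidestep the
finite-precision renormalization a practical coder needs, and it transmits a
single rational number rather than a bit stream; the Laplace-rule model and
all the names here are illustrative choices, not the course's.

    from fractions import Fraction

    def model_probs(counts, alphabet):
        # Adaptive model (Laplace's rule): P(s) proportional to count(s) + 1.
        total = sum(counts[s] for s in alphabet) + len(alphabet)
        return {s: Fraction(counts[s] + 1, total) for s in alphabet}

    def encode(seq, alphabet):
        # Narrow the interval [lo, lo + width) once per symbol.
        counts = {s: 0 for s in alphabet}
        lo, width = Fraction(0), Fraction(1)
        for sym in seq:
            probs = model_probs(counts, alphabet)
            cum = Fraction(0)
            for s in alphabet:        # mass of symbols ordered before sym
                if s == sym:
                    break
                cum += probs[s]
            lo += width * cum
            width *= probs[sym]
            counts[sym] += 1
        return lo + width / 2         # any point in the final interval works

    def decode(x, n, alphabet):
        # Replay the identical interval narrowing; x picks out each symbol.
        counts = {s: 0 for s in alphabet}
        lo, width = Fraction(0), Fraction(1)
        out = []
        for _ in range(n):
            probs = model_probs(counts, alphabet)  # same model as encoder
            cum = Fraction(0)
            for s in alphabet:
                if x < lo + width * (cum + probs[s]):
                    break             # x lies in this symbol's sub-interval
                cum += probs[s]
            out.append(s)
            lo += width * cum
            width *= probs[s]
            counts[s] += 1
        return out

    message = list("aababbb")
    point = encode(message, "ab")
    assert decode(point, len(message), "ab") == message

Note that `decode` never sees `message`: it recovers it purely by asking the
model the same questions the encoder asked.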
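
To pin down the prediction recipe in one equation: writing $h$ for an
assumption (hypothesis) and $x_{1:N}$ for the data, the predictive
distribution is a posterior-weighted average (generic notation, not
necessarily the slides'):

$$P(x_{N+1} \mid x_{1:N}) = \sum_h P(x_{N+1} \mid h)\, P(h \mid x_{1:N}),
\qquad
P(h \mid x_{1:N}) = \frac{P(x_{1:N} \mid h)\, P(h)}{P(x_{1:N})}.$$

The first factor is the prediction you'd make if you knew $h$; the second is
its plausibility from Bayes' rule. The card puzzle and the sparse file model
both instantiate this one sum.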
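
For the Beta distribution, the density and mean are the standard results:

$$p(\theta \mid \alpha, \beta) =
\frac{\theta^{\alpha-1}(1-\theta)^{\beta-1}}{B(\alpha,\beta)},
\quad \theta \in [0,1],
\qquad \mathbb{E}[\theta] = \frac{\alpha}{\alpha+\beta}.$$

In the Bernoulli setting of MacKay section 3.2, a Beta($\alpha, \beta$) prior
updated on $k$ ones in $N$ outcomes gives a Beta($\alpha+k,\, \beta+N-k$)
posterior, so the probability that the next outcome is a one is
$(\alpha+k)/(\alpha+\beta+N)$. With $\alpha=\beta=1$ this is Laplace's rule,
the adaptive model used in the coding sketch above.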