Informatics Report Series


Report EDI-INF-RR-0120


Title: An Analysis of Contrastive Divergence Learning in Gaussian Boltzmann Machines
Authors: Chris Williams; Felix Agakov
Date: May 2002
Abstract:
The Boltzmann machine (BM) learning rule for random field models with latent variables can be problematic to use in practice. These problems have (at least partially) been attributed to the negative phase in BM learning, where a Gibbs sampling chain should be run to equilibrium. Hinton (1999, 2000) introduced an alternative, contrastive divergence (CD) learning, in which the chain is run for only one step. In this paper we analyse the mean and variance of the parameter update obtained after $i$ steps of Gibbs sampling for a simple Gaussian BM. For this model our analysis shows that CD learning produces (as expected) a biased estimate of the true parameter update. We also show that the variance usually does increase with $i$, and we quantify this behaviour.
Copyright:
2002 by The University of Edinburgh. All Rights Reserved.
Links To Paper:
No links available.
BibTeX format:
@Misc{EDI-INF-RR-0120,
  author = {Chris Williams and Felix Agakov},
  title = {An Analysis of Contrastive Divergence Learning in Gaussian Boltzmann Machines},
  year = 2002,
  month = {May},
}
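
Illustrative sketch (Python):
The abstract describes CD-$i$ learning, in which the equilibrium (model) expectation in the BM gradient is replaced by statistics collected after $i$ steps of Gibbs sampling started at the data. The minimal sketch below shows one such CD-$i$ gradient estimate for a toy Gaussian Boltzmann machine; the energy E(v, h) = 0.5 v'v + 0.5 h'h - v'W h, the unit-variance conditional noise, and the helper name cd_grad are illustrative assumptions rather than the report's own formulation.

import numpy as np

def cd_grad(W, v_data, i=1, rng=None):
    """CD-i estimate of the weight gradient for a toy Gaussian Boltzmann
    machine with energy E(v, h) = 0.5 v'v + 0.5 h'h - v'W h, so that
    p(h | v) = N(W'v, I) and p(v | h) = N(Wh, I).  The maximum-likelihood
    gradient is <v h'>_data - <v h'>_model; CD-i replaces the model
    expectation with statistics taken after i steps of Gibbs sampling
    started from the data (illustrative parameterisation only)."""
    rng = np.random.default_rng() if rng is None else rng
    nv, nh = W.shape

    # Positive phase: v h' with h at its conditional mean W'v given the data.
    positive = np.outer(v_data, W.T @ v_data)

    # Negative phase: i steps of blocked Gibbs sampling starting at the data.
    v = v_data.copy()
    for _ in range(i):
        h = W.T @ v + rng.standard_normal(nh)   # h ~ p(h | v)
        v = W @ h + rng.standard_normal(nv)     # v ~ p(v | h)
    negative = np.outer(v, W.T @ v)             # v h' with h at its conditional mean

    return positive - negative

# Example: a 2-visible, 1-hidden model (spectral norm of W below 1, so the
# joint Gaussian is proper), comparing CD-1 with a longer chain.
rng = np.random.default_rng(0)
W = np.array([[0.4], [0.2]])
v = rng.standard_normal(2)
g_cd1 = cd_grad(W, v, i=1, rng=rng)
g_cd10 = cd_grad(W, v, i=10, rng=rng)

Running the chain for larger $i$ moves the negative-phase statistics towards the model expectation; the bias and variance of this update as a function of $i$ is exactly what the report analyses for the Gaussian BM.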

