Informatics Report Series

Report EDI-INF-RR-0206
Title: Variational Information Maximization in Gaussian Channels
Authors: Felix Agakov; David Barber
Date: Apr 2004
Abstract:
Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector x and its low-dimensional representation y. However, such results rest on the assumption that the sources x are Gaussian. In this paper, we show that our mutual information bound, applied in this setting, yields PCA solutions without the Gaussian assumption. Furthermore, it generalizes naturally to an objective function for Kernel PCA, enabling principled selection of kernel parameters.
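The abstract's link between information maximization and PCA can be illustrated numerically. The following is a minimal sketch, not the paper's method: it assumes a linear encoder y = Wx with a Gaussian decoder, under which the optimal orthonormal projection is onto the leading eigenvectors of the data covariance (the PCA directions), and checks that this projection retains at least as much variance as an arbitrary orthonormal projection. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 5 dimensions with anisotropic covariance.
X = rng.standard_normal((200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
X -= X.mean(axis=0)

# Under a linear-Gaussian decoder, the variational bound on I(x, y) is
# maximized (for a fixed code dimension) by projecting onto the leading
# principal directions: the top eigenvectors of the data covariance.
cov = X.T @ X / len(X)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
W_pca = eigvecs[:, -2:].T                # top-2 principal directions, shape (2, 5)

def retained_variance(W, X):
    """Total variance of the data captured by the orthonormal projection W."""
    return np.var(X @ W.T, axis=0).sum()

# Any other orthonormal 2-d projection retains no more variance.
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
W_rand = Q.T

assert retained_variance(W_pca, X) >= retained_variance(W_rand, X)
```

The point of the sketch is only the optimality property; the paper's contribution is that this conclusion follows from the variational bound without assuming Gaussian sources x.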
Copyright:
2004 by The University of Edinburgh. All Rights Reserved
Links To Paper
No links available
Bibtex format
@Misc{EDI-INF-RR-0206,
author = { Felix Agakov and David Barber },
title = {Variational Information Maximization in Gaussian Channels},
year = 2004,
month = {Apr},
}


