Informatics Report Series



Title: PAC-Bayesian Generalization Error Bounds for Gaussian Process Classification
Authors: Matthias Seeger
Date: Mar 2002
Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to Support Vector Machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of McAllester (1999), we prove distribution-free generalization error bounds for a wide range of approximate Bayesian GP classification techniques. We instantiate and test these bounds for two particular GPC techniques, including a sparse method which circumvents the unfavourable scaling of standard GP algorithms. As experiments on a real-world task show, the bounds can be very tight for moderate training sample sizes. To the best of our knowledge, these results provide the tightest known distribution-free error bounds for approximate Bayesian GPC methods, giving a strong learning-theoretical justification for the use of these techniques.
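To illustrate the kind of guarantee the abstract refers to, the sketch below evaluates one commonly quoted form of McAllester's PAC-Bayesian bound. This is a generic textbook variant, not the specific instantiation derived in the paper; the function name and the numeric inputs (sample size, empirical Gibbs risk, KL divergence between posterior and prior) are illustrative assumptions.

```python
import math

def pac_bayes_bound(emp_risk, kl, n, delta):
    """One commonly quoted form of McAllester's PAC-Bayesian bound
    (illustrative; not the exact bound proved in this report).

    With probability >= 1 - delta over the draw of n training examples,
    the expected risk of the Gibbs classifier with posterior Q and
    prior P is at most:

        emp_risk + sqrt((KL(Q||P) + ln(n/delta)) / (2 * (n - 1)))

    emp_risk : empirical risk of the Gibbs classifier on the sample
    kl       : KL divergence KL(Q||P) in nats
    n        : training sample size
    delta    : confidence parameter
    """
    return emp_risk + math.sqrt((kl + math.log(n / delta)) / (2 * (n - 1)))

# Made-up numbers: 1000 training points, 5% empirical Gibbs risk,
# KL(Q||P) = 20 nats, 95% confidence.
print(round(pac_bayes_bound(0.05, 20.0, 1000, 0.05), 3))  # prints 0.172
```

Note how the complexity term shrinks as n grows and grows with the KL divergence, which is why posteriors close to the prior (as in approximate Bayesian GPC) can yield tight bounds at moderate sample sizes.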
© 2002 by The University of Edinburgh. All Rights Reserved.
Links To Paper: No links available
Bibtex format
@techreport{seeger2002pacbayes,
  author      = {Matthias Seeger},
  title       = {PAC-Bayesian Generalization Error Bounds for Gaussian Process Classification},
  institution = {University of Edinburgh},
  year        = 2002,
  month       = {Mar},
}
Unless explicitly stated otherwise, all material is copyright The University of Edinburgh