- Abstract:
-
Whenever a graphical model contains connections from multiple nodes to a single node, statistical inference of model parameters may require the evaluation, and possibly the inversion, of the covariance matrix of all variables contributing to such a fan-in, particularly in the context of regression and classification. Thus, for high-dimensional fan-ins, statistical inference can become computationally expensive and numerically brittle. In this paper, we propose an EM-based estimation method that statistically decouples the inputs by introducing hidden variables in each branch of the fan-in. As a result, the algorithm has a per-iteration complexity that is only linear in the order of the fan-in. Interestingly, the resulting algorithm can be interpreted as a probabilistic version of backfitting and, consequently, is ideally suited for applications of backfitting that require clean propagation of probabilities, as in Bayesian inference. We demonstrate the effectiveness of Bayesian Backfitting in dealing with extremely high-dimensional, underconstrained regression problems. In addition, we highlight its connection to probabilistic partial least squares regression, and its extensions to nonlinear datasets through variational Bayesian mixture of experts regression and nonparametric locally weighted learning.
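To illustrate the backfitting idea the abstract builds on, here is a minimal sketch of classical (non-probabilistic) backfitting for a linear additive model: each sweep updates one input dimension against the partial residual of the others, so the per-sweep cost is linear in the input dimension, which is the property the paper's EM-based probabilistic version preserves. The data and model here are synthetic assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Synthetic linear additive problem: y = sum_j b_j * x_j + noise.
rng = np.random.default_rng(0)
d, n = 5, 200
X = rng.normal(size=(n, d))
b_true = rng.normal(size=d)
y = X @ b_true + 0.01 * rng.normal(size=n)

# Classical backfitting: cycle through dimensions, refitting each
# coefficient against the partial residual that excludes it.  This is
# Gauss-Seidel on the normal equations, so per sweep the cost is O(n*d),
# i.e., linear in the fan-in dimension d.
b = np.zeros(d)
for _ in range(200):                                # backfitting sweeps
    for j in range(d):
        r = y - X @ b + X[:, j] * b[j]              # partial residual w/o dim j
        b[j] = (X[:, j] @ r) / (X[:, j] @ X[:, j])  # 1-D least squares update

# For this linear case, backfitting converges to the ordinary
# least-squares solution without ever inverting the d x d covariance.
print(np.allclose(b, np.linalg.lstsq(X, y, rcond=None)[0], atol=1e-3))
```

The paper's contribution replaces these deterministic coefficient updates with EM updates over hidden variables in each fan-in branch, so that full posterior distributions, rather than point estimates, are propagated.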
- Copyright:
- 2004 by The University of Edinburgh. All Rights Reserved
- Bibtex format
- @InProceedings{EDI-INF-RR-0223,
- author = {
Sethu Vijayakumar
and Aaron D'Souza
and Stefan Schaal
},
- title = {Bayesian Backfitting for High Dimensional Regression},
- booktitle = {Proc. 21st Intl. Conf. on Machine Learning (ICML'04), Article No. 31, Banff, Canada, Jul 4-8 (2004)},
- year = 2004,
- }