Publication | Open Access
Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
Citations: 389 | References: 8 | Year: 2003
We present a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection motivated by active learning. We show how a large number of hyperparameters can be adjusted automatically by maximizing the marginal likelihood of the training data. Our method is essentially as fast as an equivalent one which selects the "support" patterns at random, yet has the potential to outperform random selection on hard curve-fitting tasks, and at the very least leads to a more stable behaviour of first-level inference, which makes the subsequent gradient-based optimization of hyperparameters much easier. In line with the development of our method, we present a simple view of sparse approximations for GP models and their underlying assumptions, and show relations to other methods.
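The forward-selection idea described in the abstract can be illustrated with a minimal sketch. The scoring rule below (add the candidate with the largest posterior variance given the points chosen so far, an active-learning-style criterion) and the subset-of-data predictor stand in for the paper's own, cheaper heuristic and its full sparse approximation; all function names and parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel with unit signal variance."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def select_support_set(X, m, noise=0.1):
    """Greedy forward selection of m 'support' points.

    At each step, add the candidate whose posterior variance given the
    current support set is largest (an active-learning-style score; the
    paper's heuristic is a cheaper, more refined criterion).
    """
    active, remaining = [], list(range(len(X)))
    for _ in range(m):
        if active:
            Kaa = rbf_kernel(X[active], X[active]) + noise ** 2 * np.eye(len(active))
            Kra = rbf_kernel(X[remaining], X[active])
            # Posterior variance at each remaining candidate (prior variance is 1).
            var = 1.0 - np.einsum("ij,ji->i", Kra, np.linalg.solve(Kaa, Kra.T))
        else:
            var = np.ones(len(remaining))  # prior variance is the same everywhere
        pick = remaining[int(np.argmax(var))]
        active.append(pick)
        remaining.remove(pick)
    return active

def predict(X, y, active, Xs, noise=0.1):
    """Subset-of-data GP posterior mean at test inputs Xs."""
    Kaa = rbf_kernel(X[active], X[active]) + noise ** 2 * np.eye(len(active))
    Ksa = rbf_kernel(Xs, X[active])
    return Ksa @ np.linalg.solve(Kaa, y[active])
```

For example, fitting a noisy sine curve with 200 training points but only 20 selected support points recovers the underlying function well, while each selection step costs only a solve against the small support-set kernel matrix rather than the full n-by-n one.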