A Framework for Evaluating Approximation Methods for Gaussian Process Regression
Krzysztof Chalupka, Christopher K. I. Williams, Iain Murray; 14(10):303−331, 2013.
Abstract
Gaussian process (GP) predictors are an important component of many Bayesian approaches to machine learning. However, even a straightforward implementation of Gaussian process regression (GPR) requires O(n²) space and O(n³) time for a data set of n examples. Several approximation methods have been proposed, but there is a lack of understanding of the relative merits of the different approximations, and in what situations they are most useful. We recommend assessing the quality of the predictions obtained as a function of the compute time taken, and comparing to standard baselines (e.g., Subset of Data and FITC). We empirically investigate four different approximation algorithms on four different prediction problems, and make our code available to encourage future comparisons.
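For concreteness, the snippet below is a minimal sketch, not the authors' code, of exact GPR prediction with an RBF kernel, making the O(n²) memory and O(n³) Cholesky costs explicit, together with the Subset of Data (SoD) baseline, which simply runs exact GPR on a random subset of m ≪ n training points. Function names, the fixed kernel hyperparameters, and the default subset size are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, signal_var=1.0):
    """Squared-exponential (RBF) kernel matrix between rows of A and B."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return signal_var * np.exp(-0.5 * sq_dists / lengthscale**2)

def gpr_predict(X, y, X_star, noise_var=0.1, **kern_args):
    """Exact GPR predictive mean/variance: O(n^2) space, O(n^3) time in n = len(X)."""
    K = rbf_kernel(X, X, **kern_args) + noise_var * np.eye(len(X))  # n x n storage
    L = np.linalg.cholesky(K)                                       # O(n^3) factorization
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))             # K^{-1} y via triangular solves
    K_star = rbf_kernel(X_star, X, **kern_args)
    mean = K_star @ alpha
    v = np.linalg.solve(L, K_star.T)
    var = rbf_kernel(X_star, X_star, **kern_args).diagonal() - np.sum(v**2, 0)
    return mean, var

def sod_predict(X, y, X_star, m=500, rng=None, **kwargs):
    """Subset of Data baseline: exact GPR on m randomly chosen training points."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    return gpr_predict(X[idx], y[idx], X_star, **kwargs)
```

Because SoD reduces the effective training-set size from n to m, its cost drops to O(m²) space and O(m³) time, which is why it serves as a natural cheap baseline against which more elaborate approximations such as FITC can be judged.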
© JMLR 2013.