Part of Advances in Neural Information Processing Systems 15 (NIPS 2002)
Anton Schwaighofer, Volker Tresp
Gaussian process regression allows a simple analytical treatment of exact Bayesian inference and has been found to provide good performance, yet scales poorly with the number of training data points. In this paper we compare several approaches to scaling Gaussian process regression to large data sets: the subset of representers method, the reduced rank approximation, online Gaussian processes, and the Bayesian committee machine. Furthermore, we provide theoretical insight into some of our experimental results. We found that subset of representers methods can give good and particularly fast predictions for data sets with high and medium noise levels. On complex low-noise data sets, the Bayesian committee machine achieves significantly better accuracy, yet at a higher computational cost.
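To make the scaling problem concrete, here is a minimal sketch of exact Gaussian process regression with a squared-exponential kernel. This is an illustrative example only (the kernel, data, and function names are assumptions, not the paper's code): the exact posterior requires factorizing an n-by-n Gram matrix, which is the O(n³) cost that motivates the approximations compared in the paper.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential (RBF) kernel matrix between rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq / length_scale**2)

def gp_predict(X, y, X_star, noise=0.1):
    """Exact GP posterior mean and variance at test inputs X_star."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))  # n x n Gram matrix
    K_s = rbf_kernel(X, X_star)                       # n x m cross-covariance
    L = np.linalg.cholesky(K)                         # O(n^3) bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf_kernel(X_star, X_star)) - np.sum(v**2, axis=0)
    return mean, var

# Usage: fit noisy samples of sin(x) and predict at two new inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
mean, var = gp_predict(X, y, np.array([[0.0], [1.5]]))
```

With only 50 training points this is fast, but the Cholesky factorization grows cubically with n, which is why methods such as the subset of representers or the Bayesian committee machine replace the full Gram matrix with lower-rank or partitioned approximations.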