Most likely heteroscedastic Gaussian process via kernel smoothing

Published in Knowledge-Based Systems, 2025

Uncertainty is an intrinsic aspect of many scientific experiments and stochastic simulations. In these settings, observation noise can vary across the input space, leading to heteroscedasticity. Heteroscedastic Gaussian process (HGP) regression has been widely used as a surrogate model in various applications due to its capability to handle noise-varying problems. However, the computational cost of HGP models remains high. In this paper, we propose a novel approach to reduce this computational burden. Our method follows a post-modelling learning strategy akin to the most likely heteroscedastic Gaussian process (MLHGP) algorithm. Unlike MLHGP and its variations, which require multiple GP models, our proposed methodology requires only one GP model to fit the main function and uses the trained kernel parameters to estimate the noise level via kernel smoothing regression. Our proposed method reduces the computational complexity from $\mathcal{O}(2\mathcal{N}^3)$ to $\mathcal{O}(\mathcal{N}^3 + \mathcal{N}^2)$ and cuts the requirement of training two models down to one. This translates to roughly a 2$\times$ speed-up during training in our test cases. We also found that the proposed method achieves better and more stable performance metrics. Additionally, in our Bayesian optimisation test case, the results show that the proposed method outperforms MLHGP when the number of initial observations is low and remains competitive in the medium and high initial observation settings, all while being faster in every case.
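The core idea above — fit one homoscedastic GP, then reuse its trained kernel parameters to estimate input-dependent noise by kernel smoothing the squared residuals — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the kernel hyperparameters (`ell`, `sigma_n`) are assumed to have already been learned by the GP fit, and the smoothing step uses a standard Nadaraya-Watson estimator with the same RBF lengthscale.

```python
import numpy as np

def rbf(x1, x2, ell):
    """Squared-exponential (RBF) kernel matrix between two 1-D input sets."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, 80))
noise_sd = 0.05 + 0.3 * (X / 10)              # noise grows with x (heteroscedastic)
y = np.sin(X) + noise_sd * rng.normal(size=X.size)

# Step 1: a single homoscedastic GP fit of the main function.
# ell and sigma_n stand in for hyperparameters found by marginal-likelihood training.
ell, sigma_n = 1.0, 0.2
K = rbf(X, X, ell) + sigma_n**2 * np.eye(X.size)   # O(N^3) via the solve below
alpha = np.linalg.solve(K, y)
mean = rbf(X, X, ell) @ alpha                      # posterior mean at training inputs

# Step 2: estimate the noise level by kernel smoothing the squared residuals,
# reusing the trained lengthscale. This costs only O(N^2) — no second GP fit.
r2 = (y - mean) ** 2
W = rbf(X, X, ell)
noise_var_hat = (W @ r2) / W.sum(axis=1)           # Nadaraya-Watson estimate
```

The resulting `noise_var_hat` is a pointwise noise-variance estimate that can be plugged back into the GP as a per-observation noise term, replacing the second full GP fit that MLHGP would use for this step.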

Download the paper here