
FAST LINEAR ALGEBRA FOR GAUSSIAN PROCESSES

Abstract
In machine learning, uncertainty calibration and prediction interpretability are crucial. Gaussian processes (GPs) are widely recognized for their ability to model uncertainty and for their interpretability. However, their practical application is often limited by the computational intensity of operations like matrix inversion and determinant computation, which scale cubically with the number of data points. This thesis develops fast algorithms that tackle this computational challenge for GPs and enhance their scalability to large datasets.

The first two chapters focus on the structured kernel interpolation (SKI) framework, which interpolates the kernel matrix from a dense grid in the input space and exploits iterative algorithms to accelerate GPs. First, we present a novel, fast iterative algorithm for GP inference within the SKI framework. After an initial computation that is linear in the number of data points, the remaining sequence of iterations scales independently of the dataset size. Our method speeds up GP inference on several low-dimensional datasets.

Unfortunately, SKI's scalability diminishes in higher dimensions, as the grid size grows exponentially with the input dimension. To mitigate this, we integrate sparse grids into the SKI framework: they interpolate accurately, and their size grows far more slowly than that of dense grids as the dimension rises. We then introduce a novel matrix-vector multiplication algorithm for sparse-grid kernel matrices, improving SKI's scalability to higher dimensions. For example, we can scale GP inference to eleven dimensions with over five million points.

The final chapter explores GPs in bandit algorithms for optimizing the ranking of top-k items on platforms such as online marketplaces and search engines. We introduce a contextual bandit algorithm using GPs with Kendall kernels, which sidesteps the restrictive assumptions typically required for reward feedback and addresses settings with many candidate items. Additionally, we develop a fast algorithm for linear-algebraic operations on the kernel matrix over top-k rankings, exploiting a sparse representation of the Kendall kernel. This reduces inference time, yielding faster bandit algorithms with lower latency.
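The abstract is high-level, so the following is purely an illustrative sketch, not code from the thesis. It contrasts the O(n^3) Cholesky-based exact GP solve with a matrix-free conjugate-gradient (CG) solve, the kind of iterative computation that structured approximations such as SKI accelerate by replacing the dense kernel matrix-vector product with a structured one. All names and parameters below are assumptions for illustration.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two point sets.
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale ** 2)

def gp_mean_exact(X, y, X_star, noise=1e-2):
    # Exact GP regression mean via Cholesky: the O(n^3) factorization
    # bottleneck that the thesis aims to avoid.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    alpha = cho_solve(cho_factor(K), y)  # cubic-cost step
    return rbf_kernel(X_star, X) @ alpha

def gp_mean_cg(X, y, X_star, noise=1e-2, iters=200, tol=1e-8):
    # Iterative alternative: solve (K + noise*I) alpha = y with conjugate
    # gradients, touching K only through matrix-vector products. Here K is
    # still dense; SKI-style methods make each product fast and structured.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    alpha = np.zeros_like(y)
    r = y.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Kp = K @ p
        step = rs / (p @ Kp)
        alpha = alpha + step * p
        r = r - step * Kp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return rbf_kernel(X_star, X) @ alpha

For the final chapter's ranking setting, the basic ingredient is the Kendall kernel: the normalized count of concordant minus discordant item pairs between two rankings. Below is a minimal sketch on full rankings without ties; the thesis's fast sparse-representation variant for top-k rankings is more involved than this quadratic pairwise loop.

from itertools import combinations

def kendall_kernel(sigma, tau):
    # Kendall kernel between two tie-free rankings, each given as a
    # sequence where sigma[i] is the rank of item i. Returns a value in
    # [-1, 1]: +1 for identical rankings, -1 for exactly reversed ones.
    n = len(sigma)
    total = 0
    for i, j in combinations(range(n), 2):
        agree = (sigma[i] - sigma[j]) * (tau[i] - tau[j])
        total += 1 if agree > 0 else -1
    return total / (n * (n - 1) / 2)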
Type
Dissertation
Date
2024-05
License
Attribution 4.0 International
http://creativecommons.org/licenses/by/4.0/