mogpe documentation
This package implements a Mixture of Gaussian Process Experts (MoGPE) model with a GP-based gating network.
Inference exploits the factorisation induced by sparse GPs and stochastically optimises a variational lower bound.
The package also provides building blocks for implementing other mixture-of-Gaussian-process-experts models.
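To make the model concrete, here is a toy NumPy sketch of the predictive density a mixture of Gaussian process experts defines: a gating network assigns each input a probability over experts, and the prediction is the gating-weighted mixture of the experts' Gaussian predictive distributions. This is purely illustrative; the function and variable names are hypothetical and do not reflect the mogpe API.

```python
import numpy as np

def softmax(a, axis=-1):
    """Numerically stable softmax over the expert dimension."""
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def moe_predictive_density(y, gating_logits, means, variances):
    """Evaluate p(y | x) = sum_k Pr(expert k | x) * N(y; mu_k(x), var_k(x)).

    y:             (N,)   targets to evaluate the density at
    gating_logits: (N, K) gating-network outputs per input
    means:         (N, K) each expert's predictive mean
    variances:     (N, K) each expert's predictive variance
    """
    probs = softmax(gating_logits)  # mixing probabilities, (N, K)
    # Gaussian density of each expert evaluated at y, (N, K)
    dens = np.exp(-0.5 * (y[:, None] - means) ** 2 / variances)
    dens /= np.sqrt(2.0 * np.pi * variances)
    return (probs * dens).sum(axis=-1)  # (N,)

# Toy example: two experts at a single input location
y = np.array([0.0])
logits = np.array([[2.0, 0.0]])  # gating strongly favours expert 0
mu = np.array([[0.0, 3.0]])      # expert predictive means
var = np.array([[0.5, 1.0]])     # expert predictive variances
density = moe_predictive_density(y, logits, mu, var)
```

In the actual model the mixing probabilities and the experts' predictive moments are produced by GPs (sparse variational GPs in mogpe's case) rather than fixed arrays, but the mixture density has this form.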
mogpe uses GPflow 2.2/TensorFlow 2.4+ for its computations, which allows fast execution on GPUs, and requires Python ≥ 3.8.
It was originally created by Aidan Scannell.
Getting Started
To get started please see the Install instructions.
Notes on using mogpe can be found in Usage, and the examples directory and notebooks show how the model can be configured and trained.
Details on the implementation can be found in What’s going on with this code?! and the mogpe API.