Create a Variational Gaussian Process distribution whose index_points are the inputs to the layer. Parameterized by the number of inducing points and a kernel_provider, which should be a tf.keras.layers.Layer with an @property that late-binds variable parameters to a tfp.math.psd_kernels.PositiveSemidefiniteKernel instance (this requirement has to do with the way variables must be created in a Keras model). The mean_fn argument is optional; if omitted, it is automatically configured as a constant function with a trainable variable output.
layer_variational_gaussian_process(
  object,
  num_inducing_points,
  kernel_provider,
  event_shape = 1,
  inducing_index_points_initializer = NULL,
  unconstrained_observation_noise_variance_initializer = NULL,
  mean_fn = NULL,
  jitter = 1e-06,
  name = NULL
)
Value: a Keras layer

Arguments:

object: What to compose the new Layer instance with. Typically a Sequential model or a Tensor (e.g., as returned by layer_input()). The return value depends on object. If object is:
- missing or NULL, the Layer instance is returned.
- a Sequential model, the model with an additional layer is returned.
- a Tensor, the output tensor from layer_instance(object) is returned.
num_inducing_points: number of inducing points in the Variational Gaussian Process distribution.
kernel_provider: a Layer instance equipped with an @property that yields a PositiveSemidefiniteKernel instance. The latter is used to parameterize the Variational Gaussian Process distribution returned by calling the layer.
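One way to build such a provider from R is to subclass a Keras layer via reticulate and attach a Python @property named kernel. The sketch below is illustrative only (the class name, parameter names, and initial values are assumptions, and it follows the pattern used in the package's worked examples rather than an API of this function): the property late-binds two trainable scalar variables into an ExponentiatedQuadratic kernel.

library(tensorflow)
library(keras)
library(tfprobability)

builtins <- reticulate::import("builtins")

RBFKernelProvider <- reticulate::PyClass(
  "RBFKernelProvider",
  inherit = tf$keras$layers$Layer,
  defs = list(
    `__init__` = function(self, ...) {
      # Call the parent Keras Layer constructor.
      tf$keras$layers$Layer$`__init__`(self, ...)
      # Unconstrained, trainable scalar kernel parameters.
      self$`_amplitude` <- self$add_weight(
        name = "amplitude", initializer = initializer_constant(0.1)
      )
      self$`_length_scale` <- self$add_weight(
        name = "length_scale", initializer = initializer_constant(1)
      )
      NULL
    },
    # The layer is only a container for the kernel variables; call() passes
    # its input through unchanged.
    call = function(self, x, ...) x,
    # The @property queried by the VGP layer: it late-binds the variables
    # into a PositiveSemidefiniteKernel instance.
    kernel = builtins$property(
      reticulate::py_func(function(self) {
        tfp$math$psd_kernels$ExponentiatedQuadratic(
          amplitude = tf$nn$softplus(self$`_amplitude`),
          length_scale = tf$nn$softplus(self$`_length_scale`)
        )
      })
    )
  )
)

An instance of this class, RBFKernelProvider(), can then be passed as the kernel_provider argument.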
event_shape: the shape of the output of the layer. This translates to a batch of underlying Variational Gaussian Process distributions. For example, event_shape = 3 means we are modelling a batch of 3 distributions over functions. We can think of this as a distribution over 3-dimensional vector-valued functions.
inducing_index_points_initializer: a tf.keras.initializers.Initializer used to initialize the trainable inducing_index_points variables. Training VGPs is fairly sensitive to the choice of initial inducing index point locations. A reasonable heuristic is to scatter them near the data, not too close to each other.
unconstrained_observation_noise_variance_initializer: a tf.keras.initializers.Initializer used to initialize the unconstrained observation noise variable. The observation noise variance is computed from this variable via the tf.nn.softplus function.
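For example (a sketch: x_train, the number of inducing points, the starting noise value, and the kernel provider kp are all assumptions, and the inducing index points are taken to be 1-dimensional):

library(tensorflow)
library(tfprobability)

num_inducing_points <- 32

# Scatter the initial inducing index points evenly over the range of the
# hypothetical 1-d training inputs x_train, following the heuristic above.
inducing_init <- tf$constant_initializer(
  matrix(seq(min(x_train), max(x_train), length.out = num_inducing_points),
         ncol = 1)
)

# softplus(-5) is roughly 0.0067, so training starts from a small
# observation noise variance.
noise_init <- tf$constant_initializer(-5)

vgp_layer <- layer_variational_gaussian_process(
  num_inducing_points = num_inducing_points,
  kernel_provider = kp,  # hypothetical kernel provider, see above
  inducing_index_points_initializer = inducing_init,
  unconstrained_observation_noise_variance_initializer = noise_init
)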
mean_fn: a callable that maps layer inputs to mean function values. Passed to the mean_fn parameter of the Variational Gaussian Process distribution. If omitted, defaults to a constant function with a trainable variable value.
jitter: a small term added to the diagonal of various kernel matrices for numerical stability.
name: name to give to this layer and the scope of ops and variables it contains.
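Example: a minimal end-to-end sketch of how the pieces fit together. All names here (rbf_kernel_provider, x_train, y_train, x_test, the layer sizes) are hypothetical placeholders, and the compile step assumes the common pattern of training against the distribution's variational_loss:

library(tensorflow)
library(keras)
library(tfprobability)

num_inducing_points <- 32

# `rbf_kernel_provider` is assumed to be an instance of a kernel-provider
# layer such as the RBFKernelProvider sketch shown above.
model <- keras_model_sequential() %>%
  layer_dense(units = 1, input_shape = 1) %>%
  layer_variational_gaussian_process(
    num_inducing_points = num_inducing_points,
    kernel_provider = rbf_kernel_provider,
    event_shape = 1,
    jitter = 1e-6
  )

# The layer outputs a Variational Gaussian Process distribution, so the
# training loss can be its variational loss (a negative evidence lower bound).
vgp_loss <- function(y, rv_y) rv_y$variational_loss(y)

model %>% compile(optimizer = optimizer_adam(), loss = vgp_loss)

# With hypothetical training data:
# model %>% fit(x_train, y_train, batch_size = 64, epochs = 100)

# Calling the fitted model on new inputs returns a distribution object,
# from which samples and moments can be drawn:
# y_dist    <- model(tf$constant(x_test, dtype = tf$float32))
# y_samples <- y_dist$sample(10L)

When training on mini-batches, the KL term in variational_loss is often down-weighted by the batch size divided by the total number of observations via its kl_weight argument.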