This function builds and trains a GP emulator.
gp(
  X,
  Y,
  name = "sexp",
  lengthscale = rep(0.1, ncol(X)),
  bounds = NULL,
  prior = "ref",
  nugget_est = FALSE,
  nugget = ifelse(nugget_est, 0.01, 1e-08),
  scale_est = TRUE,
  scale = 1,
  training = TRUE,
  verb = TRUE,
  check_rep = TRUE,
  vecchia = FALSE,
  M = 25,
  ord = NULL,
  id = NULL
)

Value

An S3 class named gp that contains five slots:
id: A number or character string assigned through the id argument.
data: a list that contains two elements, X and Y, which are the training input and output data respectively.
specs: a list that contains seven elements:
kernel: the type of the kernel function used. Either "sexp" for squared exponential kernel or "matern2.5" for Matérn-2.5 kernel.
lengthscales: a vector of lengthscales in the kernel function.
scale: the variance value in the kernel function.
nugget: the nugget value in the kernel function.
vecchia: whether the Vecchia approximation is used for the GP emulator training.
M: the size of the conditioning set for the Vecchia approximation in the GP emulator training.
constructor_obj: a 'python' object that stores the information of the constructed GP emulator.
container_obj: a 'python' object that stores the information for the linked emulation.
emulator_obj: a 'python' object that stores the information for the predictions from the GP emulator.
The returned gp object can be used by
predict() for GP predictions.
validate() for LOO and OOS validations.
plot() for validation plots.
lgp() for linked (D)GP emulator constructions.
summary() to summarize the trained GP emulator.
write() to save the GP emulator to a .pkl file.
design() for sequential designs.
update() to update the GP emulator with new inputs and outputs.
alm(), mice(), and vigf() to locate next design points.
Arguments

X: a matrix where each row is an input data point and each column is an input dimension.
Y: a matrix with only one column and each row being an output data point.
name: the kernel function to be used. Either "sexp" for the squared exponential kernel or
"matern2.5" for the Matérn-2.5 kernel. Defaults to "sexp".
lengthscale: initial values of lengthscales in the kernel function. It can be a single numeric value or a vector of length ncol(X):
if it is a single numeric value, it is assumed that kernel functions across input dimensions share the same lengthscale;
if it is a vector, it is assumed that kernel functions across input dimensions have different lengthscales.
Defaults to a vector of 0.1.
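For instance, both forms of lengthscale described above can be passed as follows (a sketch, assuming X and Y are training matrices as described under Arguments):

```r
# one shared initial lengthscale across all input dimensions
m1 <- gp(X, Y, lengthscale = 0.1)
# a separate initial lengthscale per input dimension
m2 <- gp(X, Y, lengthscale = rep(0.1, ncol(X)))
```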
bounds: the lower and upper bounds of lengthscales in the kernel function. It is a vector of length two where the first element is
the lower bound and the second element is the upper bound. The bounds will be applied to all lengthscales in the kernel function. Defaults
to NULL where no bounds are specified for the lengthscales.
prior: the prior to be used for maximum a posteriori estimation of the lengthscales and nugget of the GP: gamma prior ("ga"), inverse gamma prior ("inv_ga"),
or jointly robust prior ("ref"). Defaults to "ref". See the reference below for the jointly
robust prior.
nugget_est: a bool indicating if the nugget term is to be estimated:
FALSE: the nugget term is fixed to nugget.
TRUE: the nugget term will be estimated.
Defaults to FALSE.
nugget: the initial nugget value. If nugget_est = FALSE, the assigned value is fixed during the training.
Set nugget to a small value (e.g., 1e-8) and nugget_est to FALSE for deterministic computer models where the emulator
should interpolate the training data points. Set nugget to a larger value and nugget_est to TRUE for stochastic
emulation where the computer model outputs are assumed to follow a homogeneous Gaussian distribution. Defaults to 1e-8 if nugget_est = FALSE and
0.01 if nugget_est = TRUE.
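The two regimes described above can be sketched as follows (assuming X and Y hold training data; the argument values mirror the defaults stated above):

```r
# deterministic simulator: fix a tiny nugget so the emulator interpolates
m_det <- gp(X, Y, nugget = 1e-8, nugget_est = FALSE)
# stochastic simulator: start from a larger nugget and estimate it
m_sto <- gp(X, Y, nugget = 0.01, nugget_est = TRUE)
```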
scale_est: a bool indicating if the variance is to be estimated:
FALSE: the variance is fixed to scale.
TRUE: the variance term will be estimated.
Defaults to TRUE.
scale: the initial variance value. If scale_est = FALSE, the assigned value is fixed during the training.
Defaults to 1.
training: a bool indicating if the initialized GP emulator will be trained.
When set to FALSE, gp() returns an untrained GP emulator, to which one can apply summary() to inspect its specification or predict() to check its emulation performance before training. Defaults to TRUE.
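A sketch of inspecting an untrained emulator before committing to training, as described above (assuming X and Y hold training data):

```r
# initialize without training to review the specification first
m0 <- gp(X, Y, training = FALSE)
summary(m0)
# once satisfied, construct with the default training = TRUE
m <- gp(X, Y)
```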
verb: a bool indicating if trace information on the GP emulator construction and training will be printed during function execution.
Defaults to TRUE.
check_rep: a bool indicating whether to check for repetitions in the dataset, i.e., whether one input
position has multiple outputs. Defaults to TRUE.
vecchia: a bool indicating whether to use the Vecchia approximation for large-scale GP emulator construction and prediction. Defaults to FALSE.
The Vecchia approximation implemented for the GP emulation largely follows Katzfuss et al. (2022). See the reference below.
M: the size of the conditioning set for the Vecchia approximation in the GP emulator training. Defaults to 25.
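A sketch of a large-scale construction using the Vecchia approximation with the settings above (X_large and Y_large are hypothetical large training matrices, not part of this documentation):

```r
# Vecchia approximation with a conditioning set of size 25 (the default)
m_v <- gp(X_large, Y_large, vecchia = TRUE, M = 25)
```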
ord: an R function that returns the ordering of the input to the GP emulator for the Vecchia approximation. The function must satisfy the following basic rules:
the first argument represents the input scaled by the lengthscales.
the output of the function is a vector of indices that gives the ordering of the input to the GP emulator.
If ord = NULL, the default random ordering is used. Defaults to NULL.
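A sketch of a custom ordering function satisfying the two rules above; my_ord is a hypothetical name and the centroid-distance ordering is purely illustrative:

```r
# first argument: the input matrix scaled by the lengthscales
my_ord <- function(x_scaled) {
  x_scaled <- as.matrix(x_scaled)
  # order points by their distance from the centroid of the scaled inputs
  center <- colMeans(x_scaled)
  d <- sqrt(rowSums(sweep(x_scaled, 2, center)^2))
  # return a vector of row indices giving the ordering
  order(d)
}
m <- gp(X, Y, vecchia = TRUE, ord = my_ord)
```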
id: an ID to be assigned to the GP emulator. If an ID is not provided (i.e., id = NULL), a UUID (Universally Unique Identifier) will be automatically generated
and assigned to the emulator. Defaults to NULL.
See further examples and tutorials at https://mingdeyu.github.io/dgpsi-R/.
Gu, M. (2019). Jointly robust prior for Gaussian stochastic process in emulation, calibration and variable selection. Bayesian Analysis, 14(3), 857-885.
Katzfuss, M., Guinness, J., & Lawrence, E. (2022). Scaled Vecchia approximation for fast computer-model emulation. SIAM/ASA Journal on Uncertainty Quantification, 10(2), 537-554.
if (FALSE) {
# load the package and the Python env
library(dgpsi)
# construct a step function
f <- function(x) {
  if (x < 0.5) return(-1)
  if (x >= 0.5) return(1)
}
# generate training data
X <- seq(0, 1, length = 10)
Y <- sapply(X, f)
# training
m <- gp(X, Y)
# summarizing
summary(m)
# LOO cross validation
m <- validate(m)
plot(m)
# prediction
test_x <- seq(0, 1, length = 200)
m <- predict(m, x = test_x)
# OOS validation
validate_x <- sample(test_x, 10)
validate_y <- sapply(validate_x, f)
plot(m, validate_x, validate_y)
# write and read the constructed emulator
write(m, 'step_gp')
m <- read('step_gp')
}