laGP (version 1.1-5)

newGP: Create A New GP Object

Description

Build a Gaussian process C-side object based on the X-Z data and parameters provided, and augment that object with new data

Usage

newGP(X, Z, d, g, dK = FALSE)
newGPsep(X, Z, d, g)
updateGP(gpi, X, Z, verb = 0)
updateGPsep(gpsepi, X, Z, verb = 0)

Arguments

X
a matrix or data.frame containing the full (large) design matrix of input locations
Z
a vector of responses/dependent values with length(Z) = nrow(X)
d
a positive scalar lengthscale parameter for an isotropic Gaussian correlation function (newGP); or a vector for a separable version (newGPsep) with a more limited feature set
g
a positive scalar nugget parameter
dK
a scalar logical indicating whether or not derivative information should be maintained by the GP object; this is required for calculating MLEs/MAPs via mleGP and jmleGP (see the sketch after this list)
verb
a non-negative integer scalar indicating the verbosity level. A positive value causes progress statements to be printed to the screen for each update of i in 1:nrow(X)
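
To illustrate the role of dK, here is a rough sketch (the data mirror the sine fit in the Examples section; the names are arbitrary). mleGP's lengthscale inference relies on the stored derivative information, so dK = TRUE is needed at construction:

X <- matrix(seq(0, 2*pi, length=7), ncol=1)
Z <- sin(X)
gpi <- newGP(X, Z, d=2, g=1e-6, dK=TRUE)  ## keep derivative information
mle <- mleGP(gpi, param="d")              ## lengthscale MLE; requires dK=TRUE
deleteGP(gpi)                             ## free the C-side object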

Value

  • newGP and newGPsep create a unique GP indicator (gpi or gpsepi) referencing a C-side object; updateGP and updateGPsep do not return anything, but yield a modified C-side object as a side effect

Details

newGP allocates a new GP object on the C-side and returns its unique integer identifier (gpi), taking time which is cubic in nrow(X); allocated GP objects must (eventually) be destroyed with deleteGP or deleteGPs or memory will leak. The same applies to newGPsep, except that it deploys a separable correlation function with a more limited feature set; see deleteGPsep and deleteGPseps.
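
As a minimal sketch of this allocate-and-free lifecycle in the separable case (the small 2-d toy design below is assumed for illustration and is not part of the package examples):

x <- seq(0, 1, length=6)
Xgrid <- as.matrix(expand.grid(x, x))                    ## 36-point 2-d design
Zgrid <- sin(2*pi*Xgrid[,1]) * cos(2*pi*Xgrid[,2])
gpsepi <- newGPsep(Xgrid, Zgrid, d=c(0.1, 0.2), g=1e-6)  ## vector lengthscale
## ... work with gpsepi; it is freed in the updateGPsep sketch below ...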

updateGP takes a gpi identifier as input and augments that GP with new data. A sequence of updates is performed, one for each i in 1:nrow(X), each taking time which is quadratic in the number of data points. updateGP also updates any statistics needed in order to quickly search for new local design candidates via laGP.
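
For instance, a brief sketch of such an update with progress printing enabled (gpi is assumed to reference an existing isotropic GP, e.g., the sine fit built in the Examples section; the new inputs are made up):

Xnew <- matrix(c(0.3, 1.1, 2.2), ncol=1)  ## three hypothetical new inputs
Znew <- sin(Xnew)
updateGP(gpi, Xnew, Znew, verb=1)         ## one progress line per row of Xnew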

updateGPsep works similarly on gpsepi objects; however, local design and inference are not supported at this time.
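
Continuing the gpsepi sketch above (again with made-up data), an update followed by cleanup might look like:

Xnew <- matrix(runif(6), ncol=2)            ## three hypothetical new rows
Znew <- sin(2*pi*Xnew[,1]) * cos(2*pi*Xnew[,2])
updateGPsep(gpsepi, Xnew, Znew)             ## augment the separable GP
deleteGPsep(gpsepi)                         ## free the C-side object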

References

For standard GP inference, refer to any graduate text, e.g., Rasmussen & Williams, Gaussian Processes for Machine Learning. For efficient updates of GPs, see: R.B. Gramacy and D.W. Apley (2014). Local Gaussian process approximation for large computer experiments. Journal of Computational and Graphical Statistics, to appear; preprint available on arXiv:1303.0383 (http://arxiv.org/abs/1303.0383).

See Also

vignette("laGP"), deleteGP, deleteGPsep, mleGP, mleGPsep, predGP, predGPsep, laGP

Examples

## for more examples, see predGP and mleGP docs

## simple sine data
X <- matrix(seq(0,2*pi,length=7), ncol=1)
Z <- sin(X)

## new GP fit
gpi <- newGP(X, Z, 2, 0.000001)

## make predictions
XX <- matrix(seq(-1,2*pi+1, length=499), ncol=ncol(X))
p <- predGP(gpi, XX)

## sample from the predictive distribution
library(mvtnorm)
N <- 100
ZZ <- rmvt(N, p$Sigma, p$df) 
ZZ <- ZZ + t(matrix(rep(p$mean, N), ncol=N))
matplot(XX, t(ZZ), col="gray", lwd=0.5, lty=1, type="l", 
       xlab="x", ylab="f-hat(x)", bty="n")
points(X, Z, pch=19, cex=2)

## update with four more points
X2 <- matrix(c(pi/2, 3*pi/2, -0.5, 2*pi+0.5), ncol=1)
Z2 <- sin(X2)
updateGP(gpi, X2, Z2)

## make a new set of predictions
p2 <- predGP(gpi, XX)
ZZ <- rmvt(N, p2$Sigma, p2$df) 
ZZ <- ZZ + t(matrix(rep(p2$mean, N), ncol=N))
matplot(XX, t(ZZ), col="gray", lwd=0.5, lty=1, type="l", 
       xlab="x", ylab="f-hat(x)", bty="n")
points(X, Z, pch=19, cex=2)
points(X2, Z2, pch=19, cex=2, col=2)

## clean up
deleteGP(gpi)
