
laGP (version 1.0)

alcGP: Improvement statistics for sequential or local design

Description

Calculate the active learning Cohn (ALC) statistic, mean-squared predictive error (MSPE), or expected Fisher information (EFI) for a Gaussian process (GP) predictor relative to a set of reference locations, in support of sequential design or local search for Gaussian process regression.

Usage

alcGP(gpi, Xcand, Xref = Xcand, parallel = c("none", "omp", "gpu"), 
      verb = 0)
mspeGP(gpi, Xcand, Xref = Xcand, fi = TRUE, verb = 0)
efiGP(gpi, Xcand)

Arguments

gpi
a C-side GP object identifier (positive integer); e.g., as returned by newGP
Xcand
a matrix or data.frame containing a design of candidate predictive locations at which the ALC (or other) criterion is evaluated
Xref
a matrix or data.frame containing a design of reference locations for ALC or MSPE, i.e., the locations at which the reduction in variance, or the mean squared predictive error, is calculated
parallel
a switch indicating whether parallel calculation of the criteria is desired. For parallel = "omp", the package must be compiled with OpenMP flags; for parallel = "gpu", the package must be compiled with CUDA flags
fi
a scalar logical indicating whether the expected Fisher information portion of the expression (MSPE is essentially ALC + c(x)*EFI) should be calculated (TRUE) or set to zero (FALSE). This flag is mostly for error checking
verb
a positive integer specifying the verbosity level; verb = 0 is quiet, and larger values cause more progress information to be printed to the screen

Value

  • A vector of length nrow(Xcand) is returned filled with values corresponding to the desired statistic

Details

The best way to see how these functions are used in the context of local approximation is to inspect the code of the laGP function.

Otherwise they are fairly self-explanatory: they evaluate the ALC, MSPE, and EFI quantities outlined in Gramacy & Apley (2013). The ALC criterion is originally due to Seo et al. (2000).
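The decomposition noted under the fi argument (MSPE as ALC plus a scaled EFI term) can be probed directly via the fi flag. A minimal sketch, assuming the laGP package is installed; the design, responses, and hyperparameter values below are illustrative only:

```r
library(laGP)

## small random design and toy responses
set.seed(1)
X <- matrix(runif(40), ncol=2)
Z <- sin(5*rowSums(X))
gpi <- newGP(X, Z, d=0.5, g=1e-6, dK=TRUE)

## candidate locations; Xref defaults to Xcand
XX <- matrix(runif(20), ncol=2)
alc <- alcGP(gpi, XX)
mspe <- mspeGP(gpi, XX)            ## full MSPE
mspe0 <- mspeGP(gpi, XX, fi=FALSE) ## EFI portion set to zero

## each statistic is evaluated once per candidate row
length(alc) == nrow(XX)

deleteGP(gpi)
```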

References

R.B. Gramacy and D.W. Apley (2013). Local Gaussian process approximation for large computer experiments. Preprint available on arXiv:1303.0383; http://arxiv.org/abs/1303.0383

Seo, S., Wallat, M., Graepel, T., and Obermayer, K. (2000). Gaussian Process Regression: Active Data Selection and Test Point Rejection. In Proceedings of the International Joint Conference on Neural Networks, vol. III, 241-246. IEEE.

See Also

laGP, aGP, predGP

Examples

## this follows the example in predGP, but only evaluates 
## information statistics documented here

## Simple 2-d test function used in Gramacy & Apley (2013);
## thanks to Lee, Gramacy, Taddy, and others who have used it before
f2d <- function(x, y=NULL)
  {
    if(is.null(y)) {
      if(!is.matrix(x)) x <- matrix(x, ncol=2)
      y <- x[,2]; x <- x[,1]
    }
    g <- function(z)
      return(exp(-(z-1)^2) + exp(-0.8*(z+1)^2) - 0.05*sin(8*(z+0.1)))
    return(-g(x)*g(y))
  }

## design with N=441
x <- seq(-2, 2, length=11)
X <- as.matrix(expand.grid(x, x))
Z <- f2d(X)

## fit a GP
gpi <- newGP(X, Z, d=0.35, g=1/1000, dK=TRUE)

## predictive grid with NN=400
xx <- seq(-1.9, 1.9, length=20)
XX <- as.matrix(expand.grid(xx, xx))

## calculate the improvement statistics
alc <- alcGP(gpi, XX)
mspe <- mspeGP(gpi, XX)
efi <- efiGP(gpi, XX)
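
## (sketch, not part of the original example) a greedy sequential-design
## step might add the candidate that maximizes ALC:
xbest <- XX[which.max(alc),,drop=FALSE]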

## visualize the result
par(mfrow=c(1,3))
image(xx, xx, matrix(sqrt(alc), nrow=length(xx)), col=heat.colors(128),
      xlab="x1", ylab="x2", main="sqrt ALC")
image(xx, xx, matrix(sqrt(mspe), nrow=length(xx)), col=heat.colors(128),
      xlab="x1", ylab="x2", main="sqrt MSPE")
image(xx, xx, matrix(log(efi), nrow=length(xx)), col=heat.colors(128),
      xlab="x1", ylab="x2", main="log EFI")

## clean up
deleteGP(gpi)
