KrigInv (version 1.3.1)

sur_optim_parallel: Parallel sur criterion

Description

Evaluation of the parallel sur criterion for a batch of candidate points. It is meant to be used in optimization routines such as max_sur_parallel. To avoid numerical instabilities, the new points are evaluated only if they are not too close to an existing observation, or if there is some observation noise. The criterion is the integral of the posterior sur uncertainty.
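For intuition, here is a minimal, hedged sketch (not part of the package API) of how a sur uncertainty of this kind can be approximated by numerical integration. It assumes the uncertainty measure pn*(1 - pn), where pn is the kriging excursion probability, and uses hypothetical names mn and sn for the kriging mean and standard deviation at p integration points.

# hedged sketch: sur uncertainty approximated on p integration points,
# assuming the measure pn*(1 - pn) with pn = P(response > T)
pn  <- pnorm((mn - T) / sn)   # excursion probability at each integration point
sur <- mean(pn * (1 - pn))    # equal-weight Monte Carlo approximation of the integral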

Usage

sur_optim_parallel(x, integration.points, integration.weights = NULL, 
intpoints.oldmean, intpoints.oldsd, 
precalc.data, model, T, 
new.noise.var = NULL, batchsize, current.sur)

Arguments

x

Input vector of size batchsize*d at which one wants to evaluate the criterion. This argument is NOT a matrix.

integration.points

p*d matrix of points for numerical integration in the X space.

integration.weights

Vector of size p corresponding to the weights of these integration points.

intpoints.oldmean

Vector of size p corresponding to the kriging mean at the integration points before adding the batchsize points x to the design of experiments.

intpoints.oldsd

Vector of size p corresponding to the kriging standard deviation at the integration points before adding the batchsize points x to the design of experiments.

precalc.data

List containing useful data to compute quickly the updated kriging variance. This list can be generated using the precomputeUpdateData function.

model

Object of class km (Kriging model).

T

Target value (scalar).

new.noise.var

Optional scalar value of the noise variance for the new observations.

batchsize

Number of points to sample simultaneously. The sampling criterion returns batchsize points at a time.

current.sur

Current value of the sur criterion (before adding new observations).

Value

Value of the parallel sur criterion (a scalar).

Details

The first argument x has been chosen to be a vector of size batchsize*d (and not a matrix with batchsize rows and d columns) so that an optimizer like genoud can optimize it easily. For example, if d=2, batchsize=3 and x=c(0.1,0.2,0.3,0.4,0.5,0.6), the parallel criterion is evaluated at the three points (0.1,0.2), (0.3,0.4) and (0.5,0.6), as in the sketch below. The last argument current.sur is used as a default value for the sur criterion when the new points x are too close to existing observations.
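For illustration, a minimal sketch (not part of the package) of how such a vector can be reshaped into the corresponding batch of points:

d <- 2 ; batchsize <- 3
x <- c(0.1,0.2,0.3,0.4,0.5,0.6)
# consecutive groups of d coordinates form one point of the batch
batch <- matrix(x, ncol = d, byrow = TRUE)
# batch has rows (0.1,0.2), (0.3,0.4) and (0.5,0.6)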

References

Chevalier C., Bect J., Ginsbourger D., Vazquez E., Picheny V., Richet Y. (2011), Fast parallel kriging-based stepwise uncertainty reduction with application to the identification of an excursion set, http://hal.archives-ouvertes.fr/hal-00641108/

Chevalier C., Ginsbourger D. (2012), Corrected Kriging update formulae for batch-sequential data assimilation, http://arxiv.org/pdf/1203.6452.pdf

See Also

EGIparallel, max_sur_parallel

Examples

# sur_optim_parallel
library(KrigInv)      # for sur_optim_parallel, integration_design, precomputeUpdateData
library(DiceKriging)  # for km and the branin test function

set.seed(8)
N <- 9 #number of observations
T <- 80 #threshold
testfun <- branin

#a 9 points initial design
design <- data.frame( matrix(runif(2*N),ncol=2) )
response <- testfun(design)

#km object with matern3_2 covariance
#params estimated by ML from the observations
model <- km(formula=~., design = design, 
	response = response,covtype="matern3_2")

###we need to compute some additional arguments:
#integration points, and current kriging means and variances at these points
integcontrol <- list(n.points=50,distrib="sur",init.distrib="MC")
obj <- integration_design(integcontrol=integcontrol,
lower=c(0,0),upper=c(1,1),model=model,T=T)

integration.points <- obj$integration.points
integration.weights <- obj$integration.weights
pred <- predict_nobias_km(object=model,newdata=integration.points,
type="UK",se.compute=TRUE)
intpoints.oldmean <- pred$mean ; intpoints.oldsd <- pred$sd

#another precomputation
precalc.data <- precomputeUpdateData(model,integration.points)

batchsize <- 4
x <- c(0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8)
#one evaluation of the sur_optim_parallel criterion
#we calculate the expectation of the future "sur" uncertainty 
#when 4 points are added to the doe
#the 4 points are (0.1,0.2) , (0.3,0.4), (0.5,0.6), (0.7,0.8)
sur_optim_parallel(x=x,integration.points=integration.points,
          integration.weights=integration.weights,
          intpoints.oldmean=intpoints.oldmean,intpoints.oldsd=intpoints.oldsd,
          precalc.data=precalc.data,T=T,model=model,
          batchsize=batchsize,current.sur=Inf)


#the function max_sur_parallel will help to find the optimum,
#i.e. the batch of 4 points minimizing the expectation of the future uncertainty,
#as sketched below
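#a hedged sketch, not taken verbatim from the package documentation: it assumes
#the max_sur_parallel interface (lower, upper, batchsize, integration.param, T, model)
#and reuses the integration design "obj" computed above as integration.param
optim.result <- max_sur_parallel(lower=c(0,0),upper=c(1,1),batchsize=batchsize,
          integration.param=obj,T=T,model=model)
optim.result$par    #candidate optimal batch of 4 points
optim.result$value  #value of the parallel sur criterion for this batch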