Numerically compute the gradient of the negative log-constrained-likelihood of a Gaussian process conditional on the inequality constraints (Lopez-Lopera et al., 2018).
constrlogLikGrad(
  par = model$kernParam$par,
  model,
  parfixed = rep(FALSE, length(par)),
  mcmc.opts = list(probe = "Genz", nb.mcmc = 1000),
  estim.varnoise = FALSE
)

par: the values of the covariance parameters.
model: an object of class "lineqGP".

parfixed: indices of the fixed parameters that are not to be optimised.

mcmc.opts: MCMC options. mcmc.opts$probe is a character string giving the estimator used for the orthant multinormal probabilities; options are "Genz" (Genz, 1992) and "ExpT" (Botev, 2017). If probe == "ExpT", mcmc.opts$nb.mcmc is the number of MCMC samples used for the estimation.

estim.varnoise: if TRUE, a noise variance is estimated.
The gradient of the negative log-constrained-likelihood.
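A minimal sketch of a call, assuming 'model' is an object of class "lineqGP" that has already been built with the package's own tools (its construction is not shown, so the snippet is not run):

## Not run:
# Sketch only: 'model' is assumed to be an existing "lineqGP" object.
grad <- constrlogLikGrad(par = model$kernParam$par,
                         model = model,
                         parfixed = rep(FALSE, length(model$kernParam$par)),
                         mcmc.opts = list(probe = "ExpT", nb.mcmc = 1000),  # nb.mcmc is used when probe == "ExpT"
                         estim.varnoise = FALSE)
## End(Not run)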
As orthant multinormal probabilities don't have explicit expressions,
the gradient is implemented numerically based on nl.grad.
The orthant multinormal probabilities themselves are estimated following Genz (1992) or Botev (2017).
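As a standalone illustration of the numerical scheme (assuming that nl.grad refers to nloptr::nl.grad, a central finite-difference routine), the toy snippet below approximates the gradient of a simple smooth function and compares it with the analytical gradient; inside constrlogLikGrad the same mechanism is applied to the negative log-constrained-likelihood instead:

# Finite-difference gradient via nloptr::nl.grad (assumption: this is the
# nl.grad the help text refers to).
library(nloptr)

f <- function(theta) sum(theta^2) + prod(theta)  # toy smooth objective
theta0 <- c(0.5, 1.5)

nl.grad(x0 = theta0, fn = f)    # numerical gradient at theta0
2 * theta0 + rev(theta0)        # analytical gradient, for comparison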
Lopez-Lopera, A. F., Bachoc, F., Durrande, N., and Roustant, O. (2018), "Finite-dimensional Gaussian approximation with linear inequality constraints". SIAM/ASA Journal on Uncertainty Quantification, 6(3):1224-1255.
Bachoc, F., Lagnoux, A., and Lopez-Lopera, A. F. (2018), "Maximum likelihood estimation for Gaussian processes under inequality constraints". arXiv e-prints.
Genz, A. (1992), "Numerical computation of multivariate normal probabilities". Journal of Computational and Graphical Statistics, 1:141-150.
Botev, Z. I. (2017), "The normal law under linear restrictions: simulation and estimation via minimax tilting". Journal of the Royal Statistical Society: Series B (Statistical Methodology), 79(1):125-148.