See the vignette "Bayesian_lgcp" for examples of this code in use. The model for the data is as follows:
X(s,t) ~ Poisson[R(s,t)]
R(s,t) = C_A lambda(s,t) exp[Z(s,t)beta + Y(s,t)]
Here X(s,t) is the number of events in the cell of the computational grid containing s, R(s,t) is the Poisson rate,
C_A is the cell area, lambda(s,t) is a known offset, Z(s,t) is a vector of measured covariates and Y(s,t) is the
latent Gaussian process on the computational grid. The other parameters in the model are beta, the covariate effects,
and eta = [log(sigma), log(phi), log(theta)], the parameters of the process Y on an appropriately transformed (in this case log) scale.
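For intuition, the rate in a single cell of the grid can be evaluated directly. The following base R sketch uses made-up values for every quantity; none of these variable names are part of the lgcp API:

```r
# Illustrative only: evaluate the Poisson rate R(s,t) for one grid cell,
# using hypothetical values for the cell area, offset, covariates and latent field.
cellarea <- 0.01            # C_A, the area of one computational cell
lambda   <- 2.5             # known offset lambda(s,t) in this cell
Z        <- c(1, 0.3)       # covariate vector Z(s,t): intercept plus one covariate
beta     <- c(-1.2, 0.8)    # covariate effects
Y        <- 0.15            # latent Gaussian field value Y(s,t)

R <- cellarea * lambda * exp(sum(Z * beta) + Y)  # Poisson rate for the cell
X <- rpois(1, R)                                 # a simulated count X(s,t)
```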
We recommend the user takes the following steps before running this method:
- Compute approximate values of the parameters, eta, of the process Y using the function minimum.contrast.
These approximate values are used for two main reasons: (1) to help inform the size of the computational grid, since we
will need to use a cell width that enables us to capture the dependence properties of Y and (2) to help inform the
proposal kernel for the MCMC algorithm.
- Choose an appropriate grid on which to perform inference using the function chooseCellwidth; this will be determined
partly by the results of the first stage and partly by the computational resources available to perform inference.
- Using the function getpolyol, construct the computational grid and polygon overlays, as required. As this can be an expensive step,
we recommend that the user saves this object once it has been constructed and, in future analyses of the data,
reloads it rather than re-computing it (provided the computational grid has not changed).
- Decide which covariates are to play a part in the analysis and use the lgcp function getZmat to interpolate these
onto the computational grid. Note that having saved the results from the previous step, this is a relatively quick operation,
which allows the user to quickly construct different design matrices, Z, from different candidate models for the data.
- If required, set up the population offset using SpatialAtRisk functions (see the vignette "Bayesian_lgcp"); specify the priors
using lgcpPrior; and if desired, the initial values for the MCMC, using the function lgcpInits.
- Run the MCMC algorithm and save the output to disk. We recommend dumping information to disk using the dump2dir function
in the output.control argument because it offers much greater flexibility in terms of MCMC diagnosis and post-processing.
- Perform post-processing analyses, including MCMC diagnostic checks, and produce the summaries of the posterior expectations
required for presentation (see the vignette "Bayesian_lgcp" for further details). Functions of use in this step include
traceplots, autocorr, parautocorr, ltar, parsummary, priorpost, postcov, textsummary, expectation, exceedProbs and lgcp:::expectation.lgcpPredict.
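The steps above can be sketched as follows. This is an outline rather than runnable code: the data objects (sd, covariates, tdata) are placeholders for the user's own data, the prior and argument values are illustrative, and exact argument lists should be checked against the package documentation:

```r
library(lgcp)

# 1. Approximate the parameters eta of Y, e.g. via minimum.contrast
#    (data and model arguments are placeholders)
# mc <- minimum.contrast(sd, model = "exponential")

# 2. Choose a cell width for the computational grid, guided by step 1
# chooseCellwidth(sd, cwinit = 300)

# 3. Construct the grid and polygon overlay once, then cache it to disk
# ol <- getpolyol(data = sd, regionalcovariates = covariates,
#                 cellwidth = 300, ext = 2)
# save(ol, file = "overlay.RData")   # reload this in later sessions

# 4. Interpolate the chosen covariates onto the computational grid
# Zmat <- getZmat(formula = X ~ pop, data = sd,
#                 regionalcovariates = covariates,
#                 cellwidth = 300, ext = 2, overl = ol)

# 5. Specify priors (and, optionally, initial values via lgcpInits)
# priors <- lgcpPrior(
#   etaprior  = PriorSpec(LogGaussianPrior(mean = log(c(1, 500)),
#                                          variance = diag(0.15, 2))),
#   betaprior = PriorSpec(GaussianPrior(mean = rep(0, 2),
#                                       variance = diag(1e6, 2))))

# 6. Run the MCMC, dumping output to disk for flexible post-processing
# lg <- lgcpPredictSpatialPlusPars(..., output.control = setoutput(
#         gridfunction = dump2dir(dirname = "output/", forceSave = TRUE)))
```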
The user must provide a list of design matrices to use this function. In the interpolation step above, there are three cases to consider:
- Z(s,t) cannot be decomposed, i.e., the Z are true spatiotemporal covariates. In this case, each element of the list must
be constructed separately using the function getZmat on the covariates for each time point.
- Z(s,t)beta = Z_1(s)beta_1 + Z_2(t)beta_2: the spatial and temporal effects are separable;
in this case use the function addTemporalCovariates to aid in the construction of the list.
- Z(s,t)beta = Z(s)beta, in which case the user only needs to perform the interpolation using getZmat
once; each of the elements of the list will then be identical.
- Note that the case Z(s,t)beta = Z(t)beta is handled by following the procedure for the separable case above.
For example, if dotw is a temporal covariate, we would use formula <- X ~ dotw for the main algorithm and formula.spatial <- X ~ 1 to
interpolate the spatial covariates using getZmat, followed by temporal.formula <- t ~ dotw - 1 with addTemporalCovariates
to construct the list of design matrices, Zmat.
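A sketch of this separable case follows. The objects sd, ol and tdata are placeholders for the user's own data and saved overlay, the time-window values are illustrative, and argument lists should be checked against the package documentation:

```r
# Interpolate the purely spatial part once (intercept-only spatial model) ...
# formula.spatial <- X ~ 1
# Zmat <- getZmat(formula = formula.spatial, data = sd,
#                 cellwidth = 300, ext = 2, overl = ol)

# ... then expand to a list of per-time design matrices using the temporal
# covariate dotw (day of the week), dropping the temporal intercept
# temporal.formula <- t ~ dotw - 1
# Zmat.list <- addTemporalCovariates(temporal.formula = temporal.formula,
#                                    T = 14, laglength = 13,
#                                    tdata = tdata, Zmat = Zmat)

# the full model formula supplied to the main algorithm would then be
# formula <- X ~ dotw
```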