This function calculates the optimal bandwidth matrix (kernel covariance) for a two-dimensional animal tracking dataset, given an autocorrelated movement model (Fleming et al., 2015). This optimal bandwidth can fully account for all autocorrelation in the data, assuming it is captured by the movement model.
bandwidth(data,CTMM,VMM=NULL,weights=FALSE,fast=TRUE,
dt=NULL,precision=1/2,PC="Markov",verbose=FALSE,trace=FALSE)
data: 2D timeseries telemetry data represented as a telemetry object.
CTMM: A ctmm movement model, as from the output of ctmm.fit.
VMM: An optional vertical ctmm object for 3D bandwidth calculation.
weights: By default, the weights are taken to be uniform. weights=TRUE will optimize the weights, while a numeric array will fix the weights.
fast: Use FFT algorithms for weight optimization.
dt: Optional lag bin width for the FFT algorithm.
precision: Fraction of the maximum possible digits of precision to target in weight optimization. precision=1/2 results in about 7 decimal digits of precision if the preconditioner is stable.
PC: Preconditioner to use: can be "Markov", "circulant", "IID", or "direct".
verbose: Optionally return the optimal weights, the effective sample size DOF.H, and other information along with the bandwidth matrix H.
trace: Produce tracing information on the progress of weight optimization.
Returns a bandwidth matrix object, which is the optimal covariance matrix for the individual kernels of the kernel density estimate.
The weights argument can be used to correct fixed sampling bias with a prescribed numeric array or to correct temporal sampling bias caused by autocorrelation with weights=TRUE.
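For example, a minimal sketch of both usages, where DATA stands for a telemetry object and FIT for a fitted ctmm model (placeholder names, not part of this function's interface):

# optimize the weights to correct temporal sampling bias from autocorrelation
H <- bandwidth(DATA, FIT, weights=TRUE)

# prescribe fixed weights, here explicitly uniform
w <- rep(1, length(DATA$t))
H <- bandwidth(DATA, FIT, weights=w)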
weights=TRUE will optimize n=length(data$t) weights via constrained and preconditioned conjugate gradient algorithms. These algorithms have a few options that should be considered if the data are very irregular.
fast=TRUE grids the data with grid width dt and applies FFT algorithms, for a computational cost as low as \(O(n \log n)\) with only \(O(n)\) function evaluations. If no dt is specified, the minimum sampling interval is used, but if the data are very irregular, then dt may need to be several times smaller to avoid a slowdown. A dt that is insufficient to resolve the details of the sampling schedule will cause an excessive, \(O(n)\), number of feasibility assessments, which you can check with the trace=TRUE argument.
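For instance, with an irregular sampling schedule one might supply a finer lag bin and monitor progress; DATA and FIT are placeholders as above, and the 5-minute bin width is purely illustrative:

# use a 5-minute lag bin and trace the weight optimization
H <- bandwidth(DATA, FIT, weights=TRUE, dt=5*60, trace=TRUE)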
fast=FALSE uses the exact sampling times and has a computational cost as low as \(O(n^2)\), including \(O(n^2)\) function evaluations. With PC="direct", this method will produce a result that is exact to within machine precision, but at a computational cost of \(O(n^3)\).
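A sketch of this exact (but cubic-cost) calculation, again with placeholder DATA and FIT:

# exact times and direct preconditioner: machine-precision result at O(n^3) cost
H <- bandwidth(DATA, FIT, weights=TRUE, fast=FALSE, PC="direct")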
T. F. Chan (1988). An Optimal Circulant Preconditioner for Toeplitz Systems. SIAM Journal on Scientific and Statistical Computing, 9(4), 766-771.
C. H. Fleming, W. F. Fagan, T. Mueller, K. A. Olson, P. Leimgruber, and J. M. Calabrese (2015). Rigorous home-range estimation with movement data: A new autocorrelated kernel-density estimator. Ecology, 96(5), 1182-1188.
D. Marcotte (1996). Fast variogram computation with FFT. Computers and Geosciences, 22(10), 1175-1186.
# Load package and data
library(ctmm)
data(buffalo)
cilla <- buffalo[[1]]
# Fit a continuous-velocity model with tau ~ c(10 days, 1 hour)
# see help(variogram.fit)
GUESS <- ctmm(tau=c(10*24*60^2,60^2))
FIT <- ctmm.fit(cilla,GUESS)
# Optimize bandwidth matrix
H <- bandwidth(cilla,FIT,verbose=TRUE)
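# with verbose=TRUE the result also contains the optimal weights and the
# effective sample size DOF.H along with the bandwidth matrix H; inspect it with
str(H)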