h.ucv computes the unbiased (least-squares) cross-validation bandwidth
selector for the r'th derivative of a kernel density estimator
(one-dimensional).

h.ucv(x, ...)
## S3 method for class 'default':
h.ucv(x, deriv.order = 0, lower = 0.1 * hos, upper = 2 * hos,
      tol = 0.1 * lower, kernel = c("gaussian", "epanechnikov", "uniform",
      "triangular", "triweight", "tricube", "biweight", "cosine"), ...)

Arguments:

x: the data from which the estimate is to be computed.
deriv.order: derivative order (0 corresponds to the density itself).
lower, upper: range over which to minimize; hos (over-smoothing) is
calculated internally from the chosen kernel, see details.
tol: the convergence tolerance passed to optimize.
kernel: a character string giving the kernel to be used; the default is
"gaussian".

Details:

h.ucv implements unbiased (least-squares) cross-validation for choosing
the bandwidth $h$ of the r'th derivative kernel density estimator.
Rudemo (1982) and Bowman (1984) proposed the so-called unbiased (least-squares)
cross-validation (UCV) for the kernel density estimator. An adaptation of
unbiased cross-validation was proposed by Wolfgang et al. (1990) for bandwidth
choice in the r'th derivative of the kernel density estimator.
The essential idea of this method, for the estimation of $f^{(r)}(x)$ ($r$ is the derivative order),
is to use the bandwidth $h$ which minimizes the function:
$$UCV(h;r) = \int \left(\hat{f}_{h}^{(r)}(x)\right)^{2} dx - 2n^{-1}(-1)^{r}\sum_{i=1}^{n} \hat{f}_{h,i}^{(2r)}(X_{i})$$
The bandwidth minimizing this function is:
$$\hat{h}^{(r)}_{ucv} = \arg \min_{h^{(r)}} UCV(h;r)$$
for $r = 0, 1, 2, \dots$
where
$$\int \left(\hat{f}_{h}^{(r)}(x)\right)^{2} dx = \frac{R\left(K^{(r)}\right)}{nh^{2r+1}} + \frac{(-1)^{r}}{n (n-1) h^{2r+1}} \sum_{i=1}^{n}\sum_{j=1;j \neq i}^{n} K^{(r)} \ast K^{(r)} \left(\frac{X_{j}-X_{i}}{h}\right)$$
and $K^{(r)} \ast K^{(r)} (x)$ is the convolution of the r'th derivative kernel function $K^{(r)}(x)$
(see kernel.conv and kernel.fun).
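For the Gaussian kernel the convolution $K \ast K$ has a closed form: the convolution of two standard normal densities is the N(0, 2) density. A quick numerical check of this identity (a sketch using base R numerical integration, not package code):

```r
## Check: (K * K)(u) = integral of K(v) K(u - v) dv equals dnorm(u, sd = sqrt(2))
conv.at <- function(u) {
  integrate(function(v) dnorm(v) * dnorm(u - v), -Inf, Inf)$value
}
conv.at(0.7)              # numerical convolution at u = 0.7
dnorm(0.7, sd = sqrt(2))  # closed form; the two should agree
```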
The estimate $\hat{f}_{h,i}^{(2r)}(x)$ on the subset $\{X_{j}\}_{j \neq i}$,
denoting the leave-one-out estimator, can be written as:
$$\hat{f}_{h,i}^{(2r)}(X_{i}) = \frac{1}{(n-1) h^{2r+1}} \sum_{j \neq i} K^{(2r)} \left(\frac{X_{j}-X_{i}}{h}\right)$$
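A minimal sketch of this leave-one-out estimator for the case $r = 0$ (so $K^{(2r)} = K$) with a Gaussian kernel; the helper name loo.fhat is hypothetical, not a function of the package:

```r
## Leave-one-out density estimate at X_i, for r = 0 with a Gaussian kernel.
## loo.fhat is a hypothetical helper, not a kedd function.
loo.fhat <- function(i, x, h) {
  n <- length(x)
  ## sum of K((X_j - X_i)/h) over j != i, scaled by (n - 1) * h
  sum(dnorm((x[-i] - x[i]) / h)) / ((n - 1) * h)
}

set.seed(1)
x <- rnorm(50)
loo.fhat(1, x, h = 0.5)
```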
The function $UCV(h;r)$ is unbiased cross-validation in the sense that $E[UCV]=MISE[\hat{f}_{h}^{(r)}(x)]-R(f^{(r)}(x))$
(see Scott and George 1987). It can be simplified to the computationally convenient form:
$$UCV(h;r) = \frac{R\left(K^{(r)}\right)}{nh^{2r+1}} + \frac{(-1)^{r}}{n (n-1) h^{2r+1}} \sum_{i=1}^{n}\sum_{j=1 ;j \neq i}^{n} \left(K^{(r)} \ast K^{(r)} -2K^{(2r)}\right) \left(\frac{X_{j}-X_{i}}{h}\right)$$
where $R\left(K^{(r)}\right) = \int_{\mathbb{R}} K^{(r)}(x)^{2} dx$.
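The criterion above can be coded directly for $r = 0$ with a Gaussian kernel, for which $R(K) = 1/(2\sqrt{\pi})$, $K \ast K$ is the N(0, 2) density, and $K^{(2r)} = K$. The function name ucv.gauss is hypothetical; this is a sketch of the criterion, not the package's implementation:

```r
## UCV(h; r = 0) for a Gaussian kernel (ucv.gauss is a hypothetical name).
## For r = 0: R(K) = 1/(2*sqrt(pi)); (K * K) is the N(0, 2) density;
## K^{(2r)} = K is the standard normal density.
ucv.gauss <- function(h, x) {
  n <- length(x)
  u <- outer(x, x, "-") / h   # (X_j - X_i) / h for all pairs
  diag(u) <- NA               # drop the i = j terms
  s <- sum(dnorm(u, sd = sqrt(2)) - 2 * dnorm(u), na.rm = TRUE)
  1 / (2 * sqrt(pi) * n * h) + s / (n * (n - 1) * h)
}

set.seed(1)
x <- rnorm(100)
## Minimize over a plausible range, as h.ucv does internally via optimize().
h.hat <- optimize(ucv.gauss, interval = c(0.05, 2), x = x)$minimum
```

For $r = 0$ this is the same criterion minimized by bw.ucv in package stats, so the two selectors should give similar (though not identical, since bw.ucv uses a binned approximation) bandwidths.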
The range over which to minimize is lower to upper; hos is the over-smoothing
bandwidth, and the default is almost always satisfactory. See George and Scott
(1985), George (1990), Scott (1992, pp 165), Wand and Jones (1995, pp 61).

See Also: plot.h.ucv; bw.ucv in package stats and ucv in package MASS
(for deriv.order = 0); hlscv in package ks (for 0 <= deriv.order <= 5);
kdeb in package locfit (for deriv.order = 0).

Examples:

## Derivative order = 0
h.ucv(kurtotic, deriv.order = 0)
## Derivative order = 1
h.ucv(kurtotic, deriv.order = 1)