Description

h.mcv computes the modified cross-validation bandwidth selector of the
r'th derivative of a one-dimensional kernel density estimator.

Usage

h.mcv(x, ...)
## S3 method for class 'default':
h.mcv(x, deriv.order = 0, lower = 0.1 * hos, upper = 2 * hos,
      tol = 0.1 * lower, kernel = c("gaussian", "epanechnikov", "triweight",
      "tricube", "biweight", "cosine"), ...)

Arguments

x             vector of data values.
deriv.order   derivative order (scalar).
lower, upper  range over which to minimize; hos (over-smoothing) is
              calculated internally from the kernel, see Details. The
              default is almost always satisfactory.
tol           the convergence tolerance for optimize.
kernel        kernel function; the default is "gaussian".
...           further arguments.
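For example, the kernel and the minimization range can be set explicitly.
The call below is an illustrative sketch; the simulated data and the
bounds are arbitrary placeholders, not package defaults:

## Illustrative call with non-default arguments (placeholder values):
h.mcv(rnorm(100), deriv.order = 0, lower = 0.05, upper = 1.5,
      kernel = "biweight")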
Details

h.mcv implements modified cross-validation for choosing the bandwidth
$h$ of an r'th derivative kernel density estimator.
Stute (1992) proposed a so-called modified cross-validation (MCV) for the
kernel density estimator. This method can be extended to the estimation of
a derivative of the density; the essential idea is to approximate the
problematic term with the aid of the Hajek projection (see Stute 1992).
The minimization criterion is defined by:
$$MCV(h;r) = \frac{R\left(K^{(r)}\right)}{nh^{2r+1}} + \frac{(-1)^{r}}{n(n-1)h^{2r+1}}\sum_{i=1}^{n} \sum_{j=1;j \neq i}^{n} \varphi^{(r)} \left(\frac{X_{j}-X_{i}}{h}\right)$$
with $$\varphi^{(r)}(c) = \left(K^{(r)} \ast K^{(r)} - K^{(2r)} - \frac{\mu_{2}(K)}{2}K^{(2r+2)} \right)(c)$$
where $K^{(r)} \ast K^{(r)}(x)$ is the convolution of the r'th derivative kernel function $K^{(r)}(x)$
(see kernel.conv and kernel.fun); $R\left(K^{(r)}\right) = \int_{R} K^{(r)}(x)^{2} dx$ and $\mu_{2}(K) = \int_{R} x^{2} K(x) dx$.
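For illustration, when r = 0 and the kernel is Gaussian, the criterion
above simplifies: $\mu_{2}(K) = 1$, $R(K) = 1/(2\sqrt{\pi})$, the
convolution $K \ast K$ is the N(0, 2) density, and $K''(c) = (c^{2}-1)K(c)$.
The following R code is a minimal sketch of this special case (a direct
transcription of the formula, not the kedd internals):

## Sketch of MCV(h; r = 0) for the Gaussian kernel (illustration only):
mcv.gauss <- function(h, x) {
  n <- length(x)
  d <- outer(x, x, "-") / h    # pairwise (X_j - X_i)/h; phi is even,
  d <- d[row(d) != col(d)]     # so the sign is immaterial; drop i = j
  ## phi(c) = (K * K)(c) - K(c) - (mu_2(K)/2) * K''(c)
  phi <- dnorm(d, sd = sqrt(2)) - dnorm(d) - 0.5 * (d^2 - 1) * dnorm(d)
  1 / (2 * sqrt(pi) * n * h) + sum(phi) / (n * (n - 1) * h)
}
## Minimizing it over a range reproduces the idea behind h.mcv:
## optimize(mcv.gauss, c(0.1, 2), x = rnorm(50))$minimum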
The range over which to minimize is the over-smoothing bandwidth hos; the
default is almost always satisfactory. See Terrell and Scott (1985),
Terrell (1990), Scott (1992, p. 165), Wand and Jones (1995, p. 61).

See Also

plot.h.mcv.

Examples

## Derivative order = 0
h.mcv(kurtotic, deriv.order = 0)
## Derivative order = 1
h.mcv(kurtotic, deriv.order = 1)
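The selected bandwidth can be extracted from the returned value, and the
criterion curve inspected with plot.h.mcv. This sketch assumes the
returned object stores the bandwidth in a component named h:

## Extract the selected bandwidth (assuming a component named h):
h.mcv(kurtotic, deriv.order = 0)$h
## Inspect the MCV criterion curve via plot.h.mcv:
plot(h.mcv(kurtotic, deriv.order = 0))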