kedd (version 1.0.2)

h.mlcv: Maximum-Likelihood Cross-validation for Bandwidth Selection

Description

The (S3) generic function h.mlcv computes the maximum likelihood cross-validation (Kullback-Leibler information) bandwidth selector of a one-dimensional kernel density estimate.

Usage

h.mlcv(x, ...)
## S3 method for class 'default':
h.mlcv(x, lower = 0.1, upper = 5, tol = 0.1 * lower, 
         kernel = c("gaussian", "epanechnikov", "uniform", "triangular", 
         "triweight", "tricube", "biweight", "cosine"), ...)

Arguments

x
vector of data values.
lower, upper
range over which to maximize. The default is almost always satisfactory.
tol
the convergence tolerance for optimize.
kernel
a character string giving the smoothing kernel to be used, with default "gaussian".
...
further arguments for (non-default) methods.

Value

  • x: data points - same as input.
  • data.name: the deparsed name of the x argument.
  • n: the sample size after elimination of missing values.
  • kernel: name of kernel used.
  • h: value of bandwidth parameter.
  • mlcv: the maximal likelihood CV value.
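The returned components can be inspected directly; a minimal sketch, assuming the kedd package (and its bimodal dataset) is installed:

```r
library(kedd)             # assumes kedd is installed

fit <- h.mlcv(bimodal)    # bimodal ships with kedd
fit$h                     # selected bandwidth
fit$mlcv                  # maximal likelihood CV value at that bandwidth
fit$kernel                # "gaussian" (the default kernel)
fit$n                     # sample size after removing missing values
```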


Details

h.mlcv implements maximum-likelihood cross-validation for choosing the optimal bandwidth $h$ of a kernel density estimator. This method was proposed by Habbema, Hermans, and Van den Broek (1974) and by Duin (1976). The maximum-likelihood cross-validation (MLCV) function is defined by: $$MLCV(h) = n^{-1} \sum_{i=1}^{n} \log\left[\hat{f}_{h,i}(X_{i})\right]$$ where $\hat{f}_{h,i}$ denotes the leave-one-out estimator computed on the subset $\{X_{j}\}_{j \neq i}$: $$\hat{f}_{h,i}(X_{i}) = \frac{1}{(n-1) h} \sum_{j \neq i} K \left(\frac{X_{j}-X_{i}}{h}\right)$$ The selected bandwidth $h_{mlcv}$ is the value attaining the finite maximum of $MLCV(h)$: $$h_{mlcv} = \arg \max_{h} MLCV(h) = \arg \max_{h} \left(n^{-1} \sum_{i=1}^{n} \log\left[\sum_{j \neq i} K \left(\frac{X_{j}-X_{i}}{h}\right)\right]-\log[(n-1)h]\right)$$
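The objective above can be sketched in a few lines of base R. This is a hypothetical helper (the function name mlcv and the simulated data are illustrative, not part of kedd) using the Gaussian kernel and optimize, mirroring the lower/upper defaults of h.mlcv:

```r
# Leave-one-out log-likelihood MLCV(h) for a Gaussian kernel
# (illustrative sketch, not the kedd implementation)
mlcv <- function(x, h) {
  n <- length(x)
  # pairwise kernel evaluations K((X_j - X_i) / h)
  k <- outer(x, x, function(a, b) dnorm((a - b) / h))
  diag(k) <- 0                          # drop j = i terms (leave one out)
  fhat <- rowSums(k) / ((n - 1) * h)    # leave-one-out density estimates
  mean(log(fhat))                       # MLCV(h)
}

set.seed(1)
x <- rnorm(50)
# maximize MLCV over h on (0.1, 5), as in h.mlcv's defaults
opt <- optimize(function(h) mlcv(x, h), lower = 0.1, upper = 5, maximum = TRUE)
opt$maximum    # selected bandwidth h_mlcv
opt$objective  # maximal MLCV value
```

Note the quadratic cost of the pairwise `outer` call; for large samples the sums are usually evaluated more carefully, which is one reason to use the package routine rather than this sketch.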

References

Habbema, J. D. F., Hermans, J., and Van den Broek, K. (1974). A stepwise discrimination analysis program using density estimation. Compstat 1974: Proceedings in Computational Statistics. Physica Verlag, Vienna.

Duin, R. P. W. (1976). On the choice of smoothing parameters of Parzen estimators of probability density functions. IEEE Transactions on Computers, C-25, 1175--1179.

See Also

plot.h.mlcv, see lcv in package locfit.

Examples

h.mlcv(bimodal)
h.mlcv(bimodal, kernel = "epanechnikov")