--For kernel density estimation, kde computes
$$\hat{f}(\mathbf{x}; \mathbf{H}) = n^{-1} \sum_{i=1}^n K_{\mathbf{H}} (\mathbf{x} - \mathbf{X}_i).$$
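The estimator above can be sketched directly in NumPy (a Gaussian kernel is assumed; the function name and the fixed bandwidth matrix are illustrative, not the package's R interface):

```python
import numpy as np

def kde(x, X, H):
    """Kernel density estimate f_hat(x; H) = n^{-1} sum_i K_H(x - X_i)
    with a Gaussian kernel, where K_H(u) = |H|^{-1/2} K(H^{-1/2} u)."""
    n, d = X.shape
    Hinv = np.linalg.inv(H)
    diff = x - X                                   # (n, d) rows are x - X_i
    # quadratic forms (x - X_i)' H^{-1} (x - X_i)
    quad = np.einsum("ij,jk,ik->i", diff, Hinv, diff)
    K = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(H))
    return K.mean()

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))                  # bivariate standard normal sample
H = np.array([[0.3, 0.0], [0.0, 0.3]])             # illustrative bandwidth matrix
fhat0 = kde(np.zeros(2), X, H)                     # estimate at the origin
# the true N(0, I) density at the origin is 1/(2*pi) ~ 0.159
```

In practice the bandwidth matrix would come from one of the selectors listed below rather than being fixed by hand.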
There are several varieties of bandwidth matrix selectors:
plug-in: hpi (1-d); Hpi, Hpi.diag (2- to 6-d)
least squares cross-validation: hlscv (1-d); Hlscv, Hlscv.diag (2- to 6-d)
biased cross-validation: Hbcv, Hbcv.diag (2- to 6-d)
smoothed cross-validation: hscv (1-d); Hscv, Hscv.diag (2- to 6-d)
normal mixture MISE/AMISE: hmise.mixt, hamise.mixt (1-d); and Hmise.mixt, Hamise.mixt, Hmise.mixt.diag, Hamise.mixt.diag (2- to 6-d).
The plot method for kde objects is plot.kde.
--For kernel discriminant analysis, kda.kde computes density estimates for each of the groups in the training data, and the discriminant surface.
Its plot method is plot.kda.kde. The wrapper function Hkda computes
bandwidth matrices for each group in the training data by calling the above selectors.
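The allocation rule behind kernel discriminant analysis — assign a point to the group with the largest prior-weighted density estimate — can be sketched as follows (a NumPy illustration with Gaussian kernels; the function names, bandwidths and priors are assumptions, not the package's interface):

```python
import numpy as np

def kde(x, X, H):
    """Gaussian kernel density estimate f_hat(x; H) at a single point x."""
    n, d = X.shape
    diff = x - X
    quad = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(H), diff)
    K = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(H))
    return K.mean()

def kda_classify(x, groups, Hs, priors):
    """Assign x to the group j maximizing prior_j * f_hat_j(x)."""
    scores = [p * kde(x, X, H) for X, H, p in zip(groups, Hs, priors)]
    return int(np.argmax(scores))

rng = np.random.default_rng(1)
g0 = rng.standard_normal((200, 2)) - 2.0    # training data, group 0
g1 = rng.standard_normal((200, 2)) + 2.0    # training data, group 1
H = 0.5 * np.eye(2)                          # one illustrative bandwidth per group
label = kda_classify(np.array([-2.0, -2.0]), [g0, g1], [H, H], [0.5, 0.5])
```

The discriminant surface is the set of points where the top two prior-weighted densities are equal.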
--For kernel density derivative estimation, the main function is kdde, which computes
$$\widehat{\mathsf{D}^{\otimes r} f}(\mathbf{x}; \mathbf{H}) = n^{-1} \sum_{i=1}^n \mathsf{D}^{\otimes r} K_{\mathbf{H}} (\mathbf{x} - \mathbf{X}_i).$$
Only normal scale bandwidth selectors are currently implemented.
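For the Gaussian kernel, the gradient case $r = 1$ has a closed form, $\mathsf{D} K_{\mathbf{H}}(\mathbf{u}) = -\mathbf{H}^{-1} \mathbf{u}\, K_{\mathbf{H}}(\mathbf{u})$, so the derivative estimate is a reweighted sum of the same kernel values. A minimal NumPy sketch (names, sample, and bandwidth are illustrative):

```python
import numpy as np

def kde(x, X, H):
    """Gaussian kernel density estimate f_hat(x; H)."""
    n, d = X.shape
    diff = x - X
    quad = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(H), diff)
    K = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(H))
    return K.mean()

def kde_gradient(x, X, H):
    """Density gradient estimate (r = 1): n^{-1} sum_i D K_H(x - X_i),
    using D K_H(u) = -H^{-1} u K_H(u) for the Gaussian kernel."""
    n, d = X.shape
    Hinv = np.linalg.inv(H)
    diff = x - X
    quad = np.einsum("ij,jk,ik->i", diff, Hinv, diff)
    K = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(H))
    return (-(diff @ Hinv) * K[:, None]).mean(axis=0)   # H symmetric, so u @ Hinv = Hinv u

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
H = 0.4 * np.eye(2)
```

The gradient estimate is exactly the gradient of the density estimate, which can be checked against finite differences of kde.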
--For kernel functional estimation, kfe computes the $r$-th order integrated density functional
$$\hat{\boldsymbol{\psi}}_r (\mathbf{H}) = n^{-2} \sum_{i=1}^n \sum_{j=1}^n \mathsf{D}^{\otimes r} K_{\mathbf{H}}(\mathbf{X}_i - \mathbf{X}_j).$$ The plug-in selector is Hpi.kfe.
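For $r = 0$ the operator $\mathsf{D}^{\otimes 0}$ is the identity and the functional reduces to a scalar double sum over all pairs of data points. A NumPy sketch of that case (Gaussian kernel; names and bandwidth are illustrative):

```python
import numpy as np

def psi0(X, H):
    """psi_hat_0(H) = n^{-2} sum_i sum_j K_H(X_i - X_j), the r = 0 case
    of the integrated density functional, with a Gaussian kernel."""
    n, d = X.shape
    Hinv = np.linalg.inv(H)
    diff = X[:, None, :] - X[None, :, :]        # (n, n, d) pairwise X_i - X_j
    quad = np.einsum("ijk,kl,ijl->ij", diff, Hinv, diff)
    K = np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** d * np.linalg.det(H))
    return K.sum() / n ** 2

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
H = 0.5 * np.eye(2)
```

Equivalently, this is the average of the density estimate over the data points, since $\hat{f}(\mathbf{X}_i; \mathbf{H}) = n^{-1} \sum_j K_{\mathbf{H}}(\mathbf{X}_i - \mathbf{X}_j)$.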
--Binned kernel estimation is available for d = 1, 2, 3, 4. This makes kernel estimators
feasible for large samples, though it is only implemented for diagonal bandwidth
matrices.
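The idea behind binning is that once the data are reduced to counts on a grid, the estimate over the whole grid is a single discrete convolution of the counts with kernel weights, so the cost depends on the grid size rather than on $n$. A 1-d illustration (a sketch only, not the package's interface; production code typically uses linear binning, while this uses simple nearest-grid-point binning):

```python
import numpy as np

def binned_kde_1d(X, h, M=401):
    """Binned 1-d KDE: bin n data points onto an M-point grid, then get the
    estimate on the grid via one convolution of bin counts with kernel weights."""
    a, b = X.min() - 3 * h, X.max() + 3 * h
    grid = np.linspace(a, b, M)
    delta = grid[1] - grid[0]
    # nearest-grid-point binning of the data
    idx = np.clip(np.round((X - a) / delta).astype(int), 0, M - 1)
    counts = np.bincount(idx, minlength=M)
    # Gaussian kernel weights on grid offsets, truncated at +/- 4h
    L = int(np.ceil(4 * h / delta))
    offsets = np.arange(-L, L + 1) * delta
    w = np.exp(-0.5 * (offsets / h) ** 2) / (h * np.sqrt(2 * np.pi))
    f = np.convolve(counts, w, mode="same") / len(X)
    return grid, f

rng = np.random.default_rng(0)
X = rng.standard_normal(10_000)
grid, f = binned_kde_1d(X, h=0.3)
```

The convolution touches each of the $M$ grid points once, instead of summing over all $n$ observations at every evaluation point.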
--For an overview of this package with 2-d density estimation, see
vignette("kde").
Related packages: sm, KernSmooth.