kknn (version 1.2-1)

train.kknn: Training kknn

Description

Training of the kknn method via leave-one-out cross-validation: the best values of k and kernel are selected by minimizing the leave-one-out error.

Usage

train.kknn(formula, data, kmax = 11, distance = 2, kernel = "triangular",
	ykernel = NULL, contrasts = c('unordered' = "contr.dummy",
	ordered = "contr.ordinal"), ...)

Arguments

formula
A formula object.
data
Matrix or data frame.
kmax
Maximum value of k to consider; all values from 1 to kmax are evaluated.
distance
Parameter of the Minkowski distance (1 = Manhattan, 2 = Euclidean).
kernel
Kernel to use. Possible choices are "rectangular" (which is standard unweighted knn), "triangular", "epanechnikov" (or beta(2,2)), "biweight" (or beta(3,3)), "triweight" (or beta(4,4)), "cos", "inv", "gaussian", "rank" and "optimal".
ykernel
Window width of a y-kernel, used especially for prediction of ordinal classes.
contrasts
A vector containing the 'unordered' and 'ordered' contrasts to use.
...
Further arguments passed to or from other methods.

Value

train.kknn returns a list object of class train.kknn with the following components:

  • MISCLASS — Matrix of misclassification errors.
  • MEAN.ABS — Matrix of mean absolute errors.
  • MEAN.SQU — Matrix of mean squared errors.
  • fitted.values — List of predictions for all combinations of kernel and k.
  • best.parameters — List containing the best parameter values for kernel and k.
  • response — Type of response variable: one of continuous, nominal or ordinal.
  • distance — Parameter of the Minkowski distance.
  • call — The matched call.
  • terms — The 'terms' object used.
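As a sketch of how these components are typically inspected (the formula and dataset are taken from the Examples section below; the exact dimnames of the error matrices are an assumption and may vary by package version):

```r
library(kknn)
data(miete)

# Fit on the miete data; kmax and the kernel set are kept small
# so the leave-one-out search stays quick.
fit <- train.kknn(nmqm ~ wfl + bjkat + zh, data = miete, kmax = 11,
                  kernel = c("rectangular", "triangular"))

# best.parameters holds the kernel/k combination with the smallest
# leave-one-out error; for a continuous response the error matrix
# it was chosen from is MEAN.SQU (rows = k, columns = kernels).
fit$best.parameters
fit$MEAN.SQU
```

Calling plot() on the returned object, as in the Examples below, visualizes the same error matrices against k for each kernel.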

References

Hechenbichler, K. and Schliep, K.P. (2004) Weighted k-Nearest-Neighbor Techniques and Ordinal Classification. Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. (http://www.stat.uni-muenchen.de/sfb386/papers/dsp/paper399.ps)

Hechenbichler, K. (2005) Ensemble-Techniken und ordinale Klassifikation. PhD thesis.

Samworth, R.J. (2012) Optimal weighted nearest neighbour classifiers. Annals of Statistics, to appear. (available from http://www.statslab.cam.ac.uk/~rjs57/Research.html)

See Also

kknn and simulation

Examples

library(kknn)
data(miete)
## Continuous response
(train.con <- train.kknn(nmqm ~ wfl + bjkat + zh, data = miete, 
	kmax = 25, kernel = c("rectangular", "triangular", "epanechnikov",
	"gaussian", "rank", "optimal")))
plot(train.con)
## Ordinal response
(train.ord <- train.kknn(wflkat ~ nm + bjkat + zh, miete, kmax = 25,
 	kernel = c("rectangular", "triangular", "epanechnikov", "gaussian", 
 	"rank", "optimal")))
plot(train.ord)
## Nominal response
(train.nom <- train.kknn(zh ~ wfl + bjkat + nmqm, miete, kmax = 25, 
	kernel = c("rectangular", "triangular", "epanechnikov", "gaussian", 
	"rank", "optimal")))
plot(train.nom)
data(glass)
glass <- glass[,-1]
## Manhattan distance (distance = 1)
(fit.glass1 <- train.kknn(Type ~ ., glass, kmax = 15, kernel = 
	c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 1))
## Euclidean distance (distance = 2)
(fit.glass2 <- train.kknn(Type ~ ., glass, kmax = 15, kernel = 
	c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 2))
plot(fit.glass1)
plot(fit.glass2)
