FNN (version 1.1.3)

ownn: Optimal Weighted Nearest Neighbor Classification

Description

This function implements Samworth's optimal weighting scheme for k-nearest neighbor classification. As reported in the reference, the performance improvement is greatest when the dimension is 4.
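
As a rough sketch of the weighting scheme (taken from the formula in the reference, not from the package's internal code), the optimal weight given to the i-th nearest neighbor for an effective neighborhood size kstar and dimension d can be written in R as:

    ## Sketch only: Samworth (2012) optimal weights; closer neighbors get
    ## larger weights and the weights sum to one. Not the package's code.
    ownn_weights <- function(kstar, d) {
      i <- seq_len(kstar)
      (1 / kstar) * (1 + d / 2 -
        (d / (2 * kstar^(2 / d))) * (i^(1 + 2 / d) - (i - 1)^(1 + 2 / d)))
    }
    round(ownn_weights(kstar = 10, d = 4), 3)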

Usage

ownn(train, test, cl, testcl = NULL, k = NULL, prob = FALSE,
     algorithm = c("kd_tree", "cover_tree", "brute"))

Arguments

train

matrix or data frame of training set cases.

test

matrix or data frame of test set cases. A vector will be interpreted as a row vector for a single case.

cl

factor of true classifications of training set.

testcl

factor of true classifications of testing set for error rate calculation.

k

number of neighbours considered; chosen by 5-fold cross-validation if not supplied (see the sketch following the argument descriptions).

prob

if this is true, the proportion of the weights for the winning class is returned as attribute prob.

algorithm

nearest neighbor search algorithm.
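
When k is not supplied, it is chosen by 5-fold cross-validation. The internal procedure is not documented here; a manual cross-validation in the same spirit, using knn from this package, might look like the sketch below (the helper name cv_choose_k is made up for illustration):

    ## Illustration only: pick k by 5-fold cross-validation on the
    ## training set; ownn()'s internal procedure may differ in detail.
    library(FNN)
    cv_choose_k <- function(train, cl, ks = 1:20, nfold = 5) {
      folds <- sample(rep(1:nfold, length.out = nrow(train)))
      err <- sapply(ks, function(k) {
        mean(sapply(1:nfold, function(f) {
          pred <- knn(train[folds != f, , drop = FALSE],
                      train[folds == f, , drop = FALSE],
                      cl[folds != f], k = k)
          mean(pred != cl[folds == f])   # misclassification rate on fold f
        }))
      })
      ks[which.min(err)]                 # k with the lowest average error
    }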

Value

a list containing k, the predictions from ordinary knn, optimal weighted knn, and bagged knn, together with their accuracies if class labels of the test set are given.

References

Richard J. Samworth (2012), “Optimal Weighted Nearest Neighbor Classifiers,” Annals of Statistics, 40:5, 2733-2763.

See Also

knn in this package and knn in package class.

Examples

    data(iris3)
    # first 25 cases of each species for training, last 25 for testing
    train <- rbind(iris3[1:25,,1], iris3[1:25,,2], iris3[1:25,,3])
    test <- rbind(iris3[26:50,,1], iris3[26:50,,2], iris3[26:50,,3])
    # class labels: s = setosa, c = versicolor, v = virginica
    cl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
    testcl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
    out <- ownn(train, test, cl, testcl)
    out
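
    # The component names of the returned list are not spelled out above;
    # str() shows them. With prob = TRUE the proportion of weights for the
    # winning class is returned as attribute "prob" (also visible in str()).
    str(out)
    out2 <- ownn(train, test, cl, testcl, prob = TRUE)
    str(out2)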
