
Reduce training set for a k-NN classifier. Used after condense.
reduce.nn(train, ind, class)
Arguments:

  train   matrix for training set
  ind     initial list of members of the training set (from condense)
  class   vector of classifications for the training set

Value:

  Index vector of cases to be retained.
All the members of the training set are tried in random order. Any which, when dropped, do not cause any members of the training set to be wrongly classified are dropped.
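As an illustration of that drop rule, the sketch below (reduce_nn_sketch is a hypothetical helper written for this page, not the implementation behind reduce.nn) visits the retained cases in random order and removes a case only if every training case is still correctly classified by its nearest remaining neighbour, using 1-NN classification via knn() from the class package.

library(class)   # for knn()

reduce_nn_sketch <- function(train, ind, class) {
  keep <- ind
  for (i in ind[sample(length(ind))]) {     # visit retained cases in random order
    trial <- setdiff(keep, i)
    if (length(trial) == 0L) next           # never drop the last remaining case
    # re-classify every training case by its nearest remaining neighbour
    pred <- knn(train[trial, , drop = FALSE], train, class[trial], k = 1)
    if (all(pred == class)) keep <- trial   # dropping case i costs nothing, so drop it
  }
  keep
}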
Gates, G.W. (1972) The reduced nearest neighbor rule. IEEE Trans. Information Theory IT-18, 431--432.
Ripley, B. D. (1996) Pattern Recognition and Neural Networks. Cambridge.
Venables, W. N. and Ripley, B. D. (2002) Modern Applied Statistics with S. Fourth edition. Springer.
See also: condense, multiedit
library(class)   # provides knn(), condense() and reduce.nn()
train <- rbind(iris3[1:25,,1], iris3[1:25,,2], iris3[1:25,,3])
test <- rbind(iris3[26:50,,1], iris3[26:50,,2], iris3[26:50,,3])
cl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
keep <- condense(train, cl)           # initial condensed subset of training cases
knn(train[keep,], test, cl[keep])     # classify the test set with the condensed subset
keep2 <- reduce.nn(train, keep, cl)   # drop further cases from the subset
knn(train[keep2,], test, cl[keep2])   # classify the test set with the reduced subset
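As a rough check (an illustrative extension of the example above, not part of the original), one can compare how many training cases each step retains and the resulting test-set accuracy; the test rows follow the same class order as cl, so cl also serves as the test labels here.

length(keep)    # cases kept by condense()
length(keep2)   # cases kept after reduce.nn()
mean(knn(train[keep2, , drop = FALSE], test, cl[keep2]) == cl)   # test-set accuracy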