Description:

Estimates misclassification errors using cross-validation, bootstrap and 632plus bias corrected bootstrap estimators, based on the Random Forest, Support Vector Machines, Linear Discriminant Analysis and k-Nearest Neighbour classification methods.

Usage:

## S3 method for class 'data.frame':
classificationError(
formula,
data,
method=c("RF","SVM","LDA","KNN"),
errorType = c("cv", "boot", "six32plus"),
senSpec=TRUE,
negLevLowest=TRUE,
na.action=na.omit,
control=control.errorest(k=NROW(na.action(data)),nboot=100),
...)

Arguments:

formula: a formula of the form lhs ~ rhs relating the response (class) variable to the explanatory variables. See lm for more detail.

data: a data frame containing the variables named in the formula.

method: a vector of length 1 to 4 specifying the classification methods to be used. Can be one or more of "RF" (Random Forest), "SVM" (Support Vector Machines), "LDA" (Linear Discriminant Analysis) and "KNN" (k-Nearest Neighbour).

errorType: a vector of length 1 to 3 specifying the type of estimators to be used for computing misclassification errors. Can be one or more of "cv" (cross-validation), "boot" (bootstrap) and "six32plus" (632plus bias corrected bootstrap).

senSpec: logical; should sensitivity and specificity be computed? Defaults to TRUE.

negLevLowest: logical; whether the lowest of the class labels is treated as the negative level when computing sensitivity and specificity. Defaults to TRUE.

na.action: a function which indicates what should happen when the data contain NA's. Defaults to na.omit.

control: control parameters passed to errorest.

...: additional arguments passed to the classification method.

Value:

An object of class classificationError with components: the call to the classificationError function; a length(errorType) by length(method) matrix of classification errors; and a 2 by length(method) matrix of sensitivities (first row) and specificities (second row).
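For illustration only (not part of the original page): a minimal sketch of inspecting the returned object, assuming the package providing classificationError and simData is attached. The component names are not spelled out above, so the structure is examined generically with str().

# Illustrative sketch; component names of the result are inspected rather than assumed.
mydata <- simData(nTrain = 30, nBiom = 3)
res <- classificationError(formula = class ~ ., data = mydata,
                           method = c("RF", "LDA"),
                           errorType = c("cv", "boot"))
str(res)  # lists all components, including the length(errorType) by
          # length(method) matrix of classification errors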
Warning:

Linear Discriminant Analysis imposes conditions on the data (see lda); classificationError does not check whether these are satisfied, but the underlying function lda produces warnings if such a condition is violated.

References:

Breiman, L. (2001). Random Forests. Machine Learning 45(1), 5--32.
Chang, Chih-Chung and Lin, Chih-Jen. LIBSVM: a library for Support Vector Machines.

Ripley, B. D. (1996). Pattern Recognition and Neural Networks. Cambridge: Cambridge University Press.

Efron, B. and Tibshirani, R. (1997). Improvements on Cross-Validation: The .632+ Bootstrap Estimator. Journal of the American Statistical Association 92(438), 548--560.
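The 632plus estimator cited above (Efron and Tibshirani, 1997) combines the apparent (resubstitution) error with the leave-one-out bootstrap error using a data-driven weight. The R sketch below shows only that combination rule for orientation; it is not the package's implementation, and the helper name err632plus and the gamma_hat (no-information error rate) argument are illustrative.

# Sketch of the .632+ combination rule (Efron & Tibshirani, 1997); not the package's code.
# err_apparent: resubstitution error; err_boot1: leave-one-out bootstrap error;
# gamma_hat: no-information error rate.
err632plus <- function(err_apparent, err_boot1, gamma_hat) {
  R <- (err_boot1 - err_apparent) / (gamma_hat - err_apparent)  # relative overfitting rate
  R <- min(max(R, 0), 1)                                        # clipped to [0, 1]
  w <- 0.632 / (1 - 0.368 * R)                                  # weight grows with overfitting
  (1 - w) * err_apparent + w * err_boot1
}
err632plus(err_apparent = 0.05, err_boot1 = 0.30, gamma_hat = 0.50)  # approximately 0.249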
See Also:

simData

Examples:

mydata <- simData(nTrain = 30, nBiom = 3)
classificationError(formula = class ~ ., data = mydata)
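A further illustrative example (not from the original page): requesting a single error estimator and adjusting the resampling settings through the control argument. control.errorest appears to belong to the errorest machinery (the ipred package) used for the default value shown in the Usage section, and the argument values below are purely illustrative.

# Illustrative only: bootstrap error for two classifiers with fewer
# bootstrap replications than the default shown in Usage.
classificationError(formula = class ~ ., data = mydata,
                    method = c("RF", "SVM"),
                    errorType = "boot",
                    control = control.errorest(k = nrow(mydata), nboot = 25))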