train(x, ...)

## S3 method for class 'default':
train(x, y,
      method = "rf",
      ...,
      metric = ifelse(is.factor(y), "Accuracy", "RMSE"),
      maximize = ifelse(metric == "RMSE", FALSE, TRUE),
      trControl = trainControl(),
      tuneGrid = NULL,
      tuneLength = 3)

## S3 method for class 'formula':
train(form, data, ..., subset, na.action, contrasts = NULL)
Arguments:

x           a data frame containing training data where samples are in rows and
            features are in columns.
y           a numeric or factor vector containing the outcome for each sample.
form        a formula of the form y ~ x1 + x2 + ...
data        a data frame from which variables specified in the formula are
            preferentially to be taken.
method      a string specifying which classification or regression model to use.
            Possible values include lm, rda, lda, gbm, rf, nnet, multinom, gpls
            and lvq; see the table below for the full list.
...         arguments passed to the classification or regression routine (such
            as randomForest). Errors will occur if values for tuning parameters
            are passed here.
metric      a string specifying what summary metric will be used to select the
            optimal model: by default, "Accuracy" for classification and "RMSE"
            for regression.
maximize    a logical: should the metric be maximized or minimized?
trControl   a list of values that define how this function acts. See
            trainControl. (NOTE: If given, this argument must be named.)
tuneGrid    a data frame with possible tuning values; the columns are named the
            same as the tuning parameters, each preceded by a period. See
            createGrid in this package. (NOTE: If given, this argument must be
            named.)
tuneLength  an integer denoting the number of levels for each tuning parameter
            that should be generated by createGrid. (NOTE: If given, this
            argument must be named.)
subset      an index vector specifying the cases to be used in the training
            sample.
na.action   a function specifying the action to be taken if NAs are found.
contrasts   a list of contrasts to be used for some or all of the factors
            appearing as variables in the model formula.

Value:

A list of class train containing the final model fit, the resampled performance
estimates for each candidate combination of tuning parameters and the values
chosen for the final model. The resample component may be NULL; the
returnResamp argument of trainControl controls how much of the resampled
results are saved.
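For orientation, here is a minimal sketch of a default-method call with these arguments spelled out. The values shown simply make the documented defaults explicit, and the sketch assumes the randomForest package is installed for method = "rf":

library(caret)
data(iris)

## fit a random forest, tuning mtry over 3 candidate values
## with bootstrap resampling
rfFit <- train(iris[, 1:4], iris$Species,
               method = "rf",
               metric = "Accuracy",
               trControl = trainControl(method = "boot"),
               tuneLength = 3)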
controls how much of the resampled results are saved.train can be used to tune models by picking the complexity parameters that are associated with the optimal resampling statistics. For particular model, a grid of parameters (if any) is created and the model is trained on slightly different data for each candidate combination of tuning parameters. Across each data set, the performance of held-out samples is calculated and the mean and standard deviation is summarized for each combination. The combination with the optimal resampling statistic is chosen as the final model and the entire training set is used to fit a final model.A variety of models are currently available. The table below enumerates the models and the values of the method argument, as well as the complexity parameters used by train.
Model                                        method value   Tuning parameter(s)

Recursive partitioning                       rpart          maxdepth
                                             ctree          mincriterion
Boosted trees                                gbm            interaction.depth, n.trees, shrinkage
                                             blackboost     maxdepth, mstop
                                             ada            maxdepth, iter, nu
Boosted regression models                    glmboost       mstop
                                             gamboost       mstop
                                             logitboost     nIter
Random forests                               rf             mtry
                                             cforest        mtry
Bagged trees                                 treebag        None
Neural networks                              nnet           decay, size
Projection pursuit regression                ppr            nterms
Partial least squares                        pls            ncomp
Support vector machines (RBF)                svmradial      sigma, C
Support vector machines (polynomial)         svmpoly        scale, degree, C
Relevance vector machines (RBF)              rvmradial      sigma
Relevance vector machines (polynomial)       rvmpoly        scale, degree
Least squares support vector machines (RBF)  lssvmradial    sigma
Gaussian processes (RBF)                     gaussprRadial  sigma
Gaussian processes (polynomial)              gaussprPoly    scale, degree
Linear least squares                         lm             None
Multivariate adaptive regression splines     earth          degree, nprune
Bagged MARS                                  bagEarth       degree, nprune
M5 rules                                     M5Rules        pruned
Elastic net                                  enet           lambda, fraction
The lasso                                    enet           fraction
Penalized linear models                      penalized      lambda1, lambda2
Supervised principal components              superpc        n.components, threshold
Linear discriminant analysis                 lda            None
Stepwise diagonal discriminant analysis      sddaLDA, sddaQDA  None
Logistic/multinomial regression              multinom       decay
Regularized discriminant analysis            rda            lambda, gamma
Stabilised linear discriminant analysis      slda           None
Flexible discriminant analysis (MARS)        fda            degree, nprune
Bagged FDA                                   bagFDA         degree, nprune
C4.5 decision trees                          J48            C
k nearest neighbors                          knn3           k
Nearest shrunken centroids                   pam            threshold
Naive Bayes                                  nb             usekernel
Generalized partial least squares            gpls           K.prov
Learned vector quantization                  lvq            k
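To preview the candidate values that train would consider for a model in the table, createGrid can be called directly. A small sketch follows; the argument names (method, len) are assumptions based on the descriptions above, not a definitive interface:

## inspect the default tuning grid for a method from the table;
## for nnet this should be a data frame with columns .size and .decay
createGrid(method = "nnet", len = 3)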
By default, the function createGrid is used to define the candidate values of the tuning parameters. The user can also specify their own. To do this, a data frame is created with columns for each tuning parameter in the model. The column names must be the same as those listed in the table above with a leading dot. For example, ncomp would have the column heading .ncomp. This data frame can then be passed to train via the tuneGrid argument.
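A minimal sketch of a user-specified grid, using the k nearest neighbors model and the method value used in the examples below:

data(iris)

## one column per tuning parameter, named with a leading dot
knnGrid <- data.frame(.k = seq(1, 21, by = 2))

knnGridFit <- train(iris[, 1:4], iris[, 5],
                    method = "knn",
                    tuneGrid = knnGrid,
                    trControl = trainControl(method = "cv"))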
In some cases, models may require control arguments. These can be passed via the three dots argument. Note that some models can specify tuning parameters in the control objects. If specified, these values will be superseded by those given via the tuneGrid argument.
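For instance, an rpart control object can go through the dots. This sketch assumes the rpart and mlbench packages are installed, and illustrates that a tuning parameter (maxdepth) set in the control object is overridden by the values train evaluates:

library(rpart)
library(mlbench)
data(BostonHousing)

## minsplit is passed through ... to rpart; maxdepth, a tuning
## parameter, is superseded by the tuning grid
rpartCtlFit <- train(medv ~ ., data = BostonHousing,
                     method = "rpart",
                     control = rpart.control(minsplit = 30, maxdepth = 30))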
The vignette entitled "caret Manual -- Model Building" has more details and examples related to this function.
See also: trainControl, createGrid, createFolds.

data(iris)
TrainData <- iris[,1:4]
TrainClasses <- iris[,5]
knnFit1 <- train(TrainData, TrainClasses,
"knn",
tuneLength = 10,
trControl = trainControl(method = "cv"))
knnFit2 <- train(TrainData, TrainClasses,
"knn", tuneLength = 10,
trControl = trainControl(method = "boot"))
library(MASS)
nnetFit <- train(TrainData, TrainClasses,
"nnet",
tuneLength = 2,
trace = FALSE,
maxit = 100)
library(mlbench)
data(BostonHousing)
lmFit <- train(medv ~ . + rm:lstat,
data = BostonHousing,
"lm")
library(rpart)
rpartFit <- train(medv ~ .,
data = BostonHousing,
"rpart",
tuneLength = 9)
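A short follow-up sketch for inspecting a fit; the component names are assumed from the value description above:

## resampled performance summaries for each candidate value
rpartFit$results

## the tuning parameter value(s) selected for the final model
rpartFit$bestTune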