xgb.cv
From xgboost v0.3-2
by Tong He
Cross Validation
The cross validation function of xgboost
Usage
xgb.cv(params = list(), data, nrounds, nfold, label = NULL, showsd = TRUE,
metrics = list(), obj = NULL, feval = NULL, ...)
Arguments
- params
- the list of parameters. Commonly used ones are:
  - objective: the objective function; common ones are
    reg:linear (linear regression) and
    binary:logistic (logistic regression for classification)
- data
- takes an xgb.DMatrix as the input.
- nrounds
- the max number of iterations
- nfold
- the number of folds used
- label
- optional field, used when data is a Matrix
- showsd
- boolean, whether to show the standard deviation of the cross validation results
- metrics
- list of evaluation metrics to be used in cross validation;
when it is not specified, the evaluation metric is chosen according to the objective function.
Possible options are:
  - error: binary classification error rate
  - rmse: root mean square error
- obj
- customized objective function. Returns the gradient and second order gradient with the given prediction and dtrain.
- feval
- customized evaluation function. Returns
list(metric='metric-name', value='metric-value')
with the given prediction and dtrain.
- ...
- other parameters to pass to params.
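To illustrate the shapes expected of obj and feval, here is a minimal sketch. The logistic-loss gradient and the error-rate metric are standard illustrations, not taken from this page. Note that in xgboost both functions receive the dtrain object as their second argument and retrieve labels with getinfo(dtrain, "label"); for the sketch to run standalone, a plain label vector is passed directly instead.

```r
# Hypothetical custom objective: logistic loss.
# Given raw prediction scores and the true labels, return the
# first- and second-order gradients as list(grad = ..., hess = ...).
logregobj <- function(preds, labels) {
  preds <- 1 / (1 + exp(-preds))   # sigmoid to turn scores into probabilities
  grad <- preds - labels           # first order gradient
  hess <- preds * (1 - preds)      # second order gradient
  list(grad = grad, hess = hess)
}

# Hypothetical custom evaluation metric: binary classification error rate.
# Must return list(metric = 'metric-name', value = 'metric-value').
evalerror <- function(preds, labels) {
  err <- mean(as.numeric(preds > 0) != labels)
  list(metric = "error", value = err)
}
```

With the real API, these would be passed as obj = logregobj and feval = evalerror (with the dtrain-based signatures) alongside the other xgb.cv arguments.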
Details
This is the cross validation function for xgboost.
Parallelization is automatically enabled if OpenMP is present. Number of threads can also be manually specified via "nthread" parameter.
This function only accepts an xgb.DMatrix object as the input.
Examples
data(agaricus.train, package='xgboost')
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
history <- xgb.cv(data = dtrain, nrounds = 3, nfold = 5, metrics = list("rmse", "auc"),
                  "max.depth" = 3, "eta" = 1, "objective" = "binary:logistic")