Create the settings for an AdaBoost model (implemented in Python)
setAdaBoost(nEstimators = 50, learningRate = 1, seed = NULL)
nEstimators: The maximum number of estimators at which boosting is terminated.
learningRate: The learning rate shrinks the contribution of each classifier by learningRate; there is a trade-off between learningRate and nEstimators.
seed: A seed for the model.
# NOT RUN { model.adaBoost <- setAdaBoost(nEstimators = 50, learningRate = 1, seed = NULL) # }
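As a sketch of the learningRate/nEstimators trade-off described above (the specific values here are illustrative assumptions, not recommended settings), a smaller learning rate can be paired with a larger number of estimators:

# Illustrative only: shrink each classifier's contribution and compensate with more boosting rounds
model.adaBoostSlow <- setAdaBoost(nEstimators = 200, learningRate = 0.1, seed = 42)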