blackboost(formula, data = list(),
           tree_controls = ctree_control(teststat = "max",
                                         testtype = "Teststatistic",
                                         mincriterion = 0,
                                         maxdepth = 2,
                                         savesplitstats = FALSE),
           ...)
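A minimal sketch, not part of the original usage, of overriding the default tree depth through tree_controls; maxdepth = 4 and the cars data are arbitrary illustrations, and ctree_control() is assumed to be available from package party, as in the usage above.

### sketch: deeper base-learner trees via tree_controls (maxdepth = 4 is arbitrary)
library("mboost")
bb <- blackboost(dist ~ speed, data = cars,
                 tree_controls = ctree_control(teststat = "max",
                                               testtype = "Teststatistic",
                                               mincriterion = 0,
                                               maxdepth = 4))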
"TreeControl"
, which can be
obtained using ctree_control
.
Defines hyper-parameters for the trees which are used as base-learnermboost_fit
,
including weights
, offset
, family
and
control
. For default values see
gbm
. The
main difference is that arbitrary loss functions to be optimized
can be specified via the family
argument to blackboost
whereas
gbm
uses hard-coded loss functions.
Moreover, the base-learners (conditional
inference trees, see ctree
) are a little bit more flexible.The regression fit is a black box prediction machine and thus hardly interpretable.
Partial dependency plots are not yet available; see example section for plotting of additive tree models.
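As an illustrative sketch of the family argument (the choice of mboost's Laplace() family for absolute-error loss is an assumption; any Family object could be plugged in):

### sketch: L1 (absolute error) loss instead of the default squared error loss
cars.l1 <- blackboost(dist ~ speed, data = cars, family = Laplace())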
Torsten Hothorn, Kurt Hornik and Achim Zeileis (2006). Unbiased recursive partitioning: A conditional inference framework. Journal of Computational and Graphical Statistics, 15(3), 651--674.
Yoav Freund and Robert E. Schapire (1996), Experiments with a new boosting algorithm. In Machine Learning: Proc. Thirteenth International Conference, 148--156.
Jerome H. Friedman (2001), Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 29, 1189--1232.
Greg Ridgeway (1999), The state of boosting. Computing Science and Statistics, 31, 172--181.
See mboost for the generic boosting function, glmboost for boosted linear
models and gamboost for boosted additive models. See cvrisk for cross-validated
stopping iteration (a sketch follows the first example below). Furthermore see
boost_control, Family and methods.
### a simple two-dimensional example: cars data
cars.gb <- blackboost(dist ~ speed, data = cars,
                      control = boost_control(mstop = 50))
cars.gb
### plot fit
plot(dist ~ speed, data = cars)
lines(cars$speed, predict(cars.gb), col = "red")
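As a sketch beyond the original example, the stopping iteration for cars.gb could be chosen by cross-validation via cvrisk; the 10-fold setup built with cv() and model.weights() below is one assumed choice, not the only one.

### sketch: cross-validated choice of mstop for cars.gb
cvm <- cvrisk(cars.gb, folds = cv(model.weights(cars.gb), type = "kfold"))
plot(cvm)
mstop(cvm)            ## estimated optimal stopping iteration
cars.gb[mstop(cvm)]   ## set the model to this iteration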
### set up and plot additive tree model
ctrl <- ctree_control(maxdepth = 3)
viris <- subset(iris, Species != "setosa")
## drop the unused factor level "setosa"
viris$Species <- viris$Species[, drop = TRUE]
## two-class problem, hence family = Binomial(); [500] runs 500 boosting iterations
imod <- mboost(Species ~ btree(Sepal.Length, tree_controls = ctrl) +
                   btree(Sepal.Width, tree_controls = ctrl) +
                   btree(Petal.Length, tree_controls = ctrl) +
                   btree(Petal.Width, tree_controls = ctrl),
               data = viris, family = Binomial())[500]
layout(matrix(1:4, ncol = 2))
plot(imod)
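As a sketch beyond the original example (type = "class" and type = "response" are taken to be supported by predict for the Binomial family), the fitted classification can be inspected as follows:

### sketch: in-sample class predictions of the additive tree model
table(viris$Species, predict(imod, type = "class"))
head(predict(imod, type = "response"))   ## fitted probabilities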