Gradient boosting for optimizing arbitrary loss functions, with regression trees used as base learners.

Usage
BlackBoostModel(
family = NULL,
mstop = 100,
nu = 0.1,
risk = c("inbag", "oobag", "none"),
stopintern = FALSE,
trace = FALSE,
teststat = c("quadratic", "maximum"),
testtype = c("Teststatistic", "Univariate", "Bonferroni", "MonteCarlo"),
mincriterion = 0,
minsplit = 10,
minbucket = 4,
maxdepth = 2,
saveinfo = FALSE,
...
)
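For illustration, a minimal sketch of a model specification with non-default values; the hyperparameter settings shown here are arbitrary examples, not recommendations.

## More boosting iterations, a smaller step size, and deeper base-learner
## trees (illustrative values only)
model <- BlackBoostModel(mstop = 500, nu = 0.05, maxdepth = 3)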
Arguments

family: optional Family object. Set automatically according to the class type of the response variable.

mstop: number of initial boosting iterations.

nu: step size or shrinkage parameter between 0 and 1.

risk: method to use in computing the empirical risk for each boosting iteration.

stopintern: logical indicating whether the boosting algorithm stops internally when the out-of-bag risk increases at a subsequent iteration.

trace: logical indicating whether status information is printed during the fitting process.

teststat: type of the test statistic to be applied for variable selection.

testtype: how to compute the distribution of the test statistic.

mincriterion: value of the test statistic or 1 - p-value that must be exceeded in order to implement a split.

minsplit: minimum sum of weights in a node in order to be considered for splitting.

minbucket: minimum sum of weights in a terminal node.

maxdepth: maximum depth of the tree.

saveinfo: logical indicating whether to store information about variable selection in the info slot of each partynode.

...: additional arguments passed to ctree_control.
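The family argument is normally left NULL and inferred from the response, but an mboost Family object can be supplied explicitly. A minimal sketch, assuming the mboost package is installed and a binary factor response:

## Request a logistic (binomial) loss explicitly via an mboost Family
model <- BlackBoostModel(family = mboost::Binomial())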
Value

An MLModel class object.
Details

Response types: binary, numeric, Surv

Automatic tuning of grid parameters: mstop, maxdepth

Default values for the NULL arguments and further model details can be found in the source links below.
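Because mstop and maxdepth are automatically tunable grid parameters, the model can be wrapped for tuning. A hedged sketch, assuming MachineShop's TunedModel interface and the MASS package for the Pima.tr data:

library(MachineShop)
library(MASS)  # provides the Pima.tr dataset

## Tune mstop and maxdepth over an automatically generated grid
tuned_fit <- fit(type ~ ., data = Pima.tr, model = TunedModel(BlackBoostModel))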
Examples

## Requires prior installation of the suggested mboost package
library(MachineShop)
library(MASS)  # provides the Pima.tr training data

model_fit <- fit(type ~ ., data = Pima.tr, model = BlackBoostModel)
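A possible follow-up, assuming the fit above succeeds: predictions on the companion Pima.te test set, also from MASS. The "prob" type shown here is assumed to request class probabilities.

## Predicted classes and class probabilities for the held-out set
predict(model_fit, newdata = Pima.te)
predict(model_fit, newdata = Pima.te, type = "prob")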