Description:

Fits a gradient boosting model (a gbm object) with a fixed number of trees. The optimal number of trees can be identified using gbm.step or some other procedure. It is mostly used as a utility function, e.g. when called by gbm.simplify. It takes as input a data frame and arguments selecting the x and y variables, the learning rate and the tree complexity.
Usage:

gbm.fixed(data, gbm.x, gbm.y, tree.complexity = 1, site.weights = rep(1, nrow(data)), verbose = TRUE, learning.rate = 0.001, n.trees = 2000, bag.fraction = 0.5, family = "bernoulli", keep.data = FALSE, var.monotone = rep(0, length(gbm.x)))
Arguments:

data: the input data frame

gbm.x: indices of the predictor variables in the input data frame

gbm.y: index of the response variable in the input data frame

tree.complexity: the tree depth, sometimes referred to as interaction depth

site.weights: observation weights; by default all set equal

verbose: controls reporting of progress to screen

learning.rate: the shrinkage parameter, controlling the speed of the gradient descent

n.trees: the fixed number of trees to fit; default 2000

bag.fraction: the proportion of observations randomly sampled to fit each new tree

family: can be any of "bernoulli", "poisson", "gaussian", or "laplace"

keep.data: if TRUE, the original data are kept in the returned object

var.monotone: constrains the fitted relationship with each predictor to be monotone positive (1), monotone negative (-1), or unconstrained (0, the default)
Value:

An object of class gbm.
References:

Elith, J., J.R. Leathwick and T. Hastie, 2008. A working guide to boosted regression trees. Journal of Animal Ecology 77: 802-813.
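Examples:

A minimal usage sketch. It assumes the dismo package (which provides gbm.fixed and the bundled Anguilla_train data set) and the gbm package are installed; the column indices follow the convention of the related gbm.step examples, where column 2 holds the presence/absence response and columns 3 to 13 hold the predictors.

```r
# sketch only: assumes dismo and gbm are installed
library(dismo)
data(Anguilla_train)

# fit a Bernoulli BRT with a fixed number of trees:
# response in column 2, predictors in columns 3 to 13
angaus.fixed <- gbm.fixed(data = Anguilla_train,
                          gbm.x = 3:13,
                          gbm.y = 2,
                          tree.complexity = 5,
                          learning.rate = 0.005,
                          n.trees = 1000,
                          family = "bernoulli")

# inspect relative influence of the predictors
summary(angaus.fixed)
```

Because the number of trees is fixed rather than optimised, a model fitted this way is typically used for quick utility fits (as in gbm.simplify) rather than as a final predictive model; use gbm.step when the optimal tree count matters.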