Usage
bst(x, y, cost = 0.5, family = c("gaussian", "hinge", "hinge2", "binom", "expo",
    "poisson", "tgaussianDC", "thingeDC", "tbinomDC", "binomdDC", "texpoDC",
    "tpoissonDC", "huber", "thuberDC"), ctrl = bst_control(),
    control.tree = list(maxdepth = 1), learner = c("ls", "sm", "tree"))
"print"(x, ...)
"predict"(object, newdata=NULL, newy=NULL, mstop=NULL,
type=c("response", "all.res", "class", "loss", "error"), ...)
"plot"(x, type = c("step", "norm"),...)
"coef"(object, which=object$ctrl$mstop, ...)
"fpartial"(object, mstop=NULL, newdata=NULL)
Arguments
x
a data frame containing the variables in the model.
y
vector of responses. y must be in {1, -1} for family = "hinge".
cost
price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.
family
the loss function to be minimized, chosen from the values listed in Usage;
for example, family = "hinge" gives the hinge loss and family = "gaussian" gives squared error loss.
Boosting fits the negative gradient corresponding to the chosen loss function
(see the first sketch following the argument list). For the hinge loss, binary responses must be coded +1/-1.
control.tree
control parameters of rpart, used when learner = "tree" (see the second sketch following the argument list).
learner
a character string specifying the component-wise base learner to be used:
  "ls": linear models,
  "sm": smoothing splines,
  "tree": regression trees.
newdata
new data for prediction with the same number of columns as x.
mstop
boosting iteration for prediction.
which
the boosting iteration (mstop) at which to extract coefficients.
...
additional arguments.
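The first sketch below illustrates the negative-gradient idea mentioned under family, using the standard (unweighted) hinge loss for +1/-1 responses. It is written from the textbook definition of the hinge loss, not taken from the package source; the package's cost-weighted and truncated variants differ.

hinge_loss <- function(y, f) pmax(0, 1 - y * f)                 # hinge loss, y in {+1, -1}
hinge_neg_gradient <- function(y, f) ifelse(y * f < 1, y, 0)    # -d/df of max(0, 1 - y * f)
# At each boosting step the base learner is fitted to these negative-gradient
# values (pseudo-residuals), evaluated at the current fit f.
y <- c(1, -1, 1, -1)
f <- c(0.3, -2, -0.5, 0.4)
hinge_loss(y, f)
hinge_neg_gradient(y, f)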
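A further sketch, reusing the simulated data frame x from the sketch under Usage, shows the tree base learner together with control.tree and the fpartial extractor. The response yc, the maxdepth setting, and the use of str() are illustrative assumptions, not taken from the package documentation.

yc <- x[, 1]^2 + rnorm(nrow(x))                     # a continuous response for family = "gaussian"
fit.tree <- bst(x, yc, family = "gaussian", learner = "tree",
                control.tree = list(maxdepth = 2))
head(predict(fit.tree, newdata = x, type = "response"))   # fitted values on new data
str(fpartial(fit.tree, newdata = x))                      # inspect the partial fits returned by fpartial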