These are objects representing fitted boosting trees.
an object of class BTInit containing the initial fitted value initFit, the initial training.error and, if any, the initial validation.error.
an object of class BTErrors containing the error vectors for each iteration performed (excluding the initialization). More precisely, it contains the training.error,
the validation.error if train.fraction < 1 and the oob.improvement if bag.fraction < 1.
Moreover, if cross-validation was performed, a vector of cross-validation errors cv.error as a function of the boosting iteration is also stored in this object.
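As an illustration, the stored error vectors can be used to monitor convergence. The sketch below is indicative only: it assumes a fitted object bt_fit returned by the BT algorithm, with components accessed as list elements under the names documented here.

```r
# Sketch only: assumes 'bt_fit' is a BTFit object and that its components
# are accessed as list elements named as in this documentation.
n_iter <- length(bt_fit$BTErrors$training.error)

# Plot the training error across boosting iterations.
plot(seq_len(n_iter), bt_fit$BTErrors$training.error,
     type = "l", xlab = "Boosting iteration", ylab = "Error",
     main = "Error per boosting iteration")

# Overlay the validation error, available when train.fraction < 1.
if (!is.null(bt_fit$BTErrors$validation.error)) {
  lines(seq_len(n_iter), bt_fit$BTErrors$validation.error, lty = 2)
}
```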
an object of class BTIndivFits containing the list of each individual tree fitted at each boosting iteration.
the Tweedie power parameter (and hence the assumed distribution) used by the algorithm. It is currently always equal to 1.
a vector containing the names of the explanatory variables.
the name of the target/response variable.
a vector containing the weights used.
the seed used, if any.
if keep.data=TRUE, an object of class BTData containing the training.set and validation.set (the latter can be NULL if not used). These data frames are reduced
to the variables actually used, namely the response and the explanatory variables. Note that in case of cross-validation, the folds are not kept even if keep.data=TRUE; only the data
frames related to the original fit (i.e. on the whole training set) are saved.
an object of class BTParams containing all the (Adaptive) boosting tree parameters. More precisely, it contains the ABT, train.fraction,
shrinkage, interaction.depth, bag.fraction, n.iter, colsample.bytree and tree.control parameter values.
the keep.data parameter value.
the is.verbose parameter value.
the fitted values of the training set on the score scale, computed using all n.iter iterations as well as the initialization.
the number of cross-validation folds. Set to 1 if no cross-validation was performed.
the original call to the BT algorithm.
the model.frame terms argument.
a vector identifying the fold to which each observation belongs. This component is not present if no cross-validation was performed. Conversely, it corresponds
to folds.id if the latter was initially defined by the user.
a vector containing the cross-validation fitted values, if cross-validation was performed. More precisely, for a given observation, the prediction is given by the cv-model
for which this specific observation was out-of-fold. See predict.BTCVFit for more details.
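The out-of-fold fitted values can be compared with the in-sample ones to assess overfitting. The following sketch is indicative only: it assumes a BTFit object bt_fit fitted with cross-validation, so that the cv.fitted and folds components listed above are available.

```r
# Sketch only: assumes 'bt_fit' is a BTFit object fitted with cross-validation,
# so that the 'cv.fitted' and 'folds' components exist.

# Out-of-fold predictions (score scale) next to in-sample fitted values.
head(cbind(cv = bt_fit$cv.fitted, in_sample = bt_fit$fitted.values))

# Fold membership of each training observation.
table(bt_fit$folds)
```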
The following components must be included in a legitimate BTFit object.
Gireg Willame gireg.willame@gmail.com
This package is inspired by the gbm3 package. For more details, see https://github.com/gbm-developers/gbm3/.
Boosting Tree Model Object.
M. Denuit, D. Hainaut and J. Trufin (2019). Effective Statistical Learning Methods for Actuaries I: GLMs and Extensions, Springer Actuarial.
M. Denuit, D. Hainaut and J. Trufin (2019). Effective Statistical Learning Methods for Actuaries II: Tree-Based Methods and Extensions, Springer Actuarial.
M. Denuit, D. Hainaut and J. Trufin (2019). Effective Statistical Learning Methods for Actuaries III: Neural Networks and Extensions, Springer Actuarial.
M. Denuit, D. Hainaut and J. Trufin (2022). Response versus gradient boosting trees, GLMs and neural networks under Tweedie loss and log-link. Accepted for publication in Scandinavian Actuarial Journal.
M. Denuit, J. Huyghe and J. Trufin (2022). Boosting cost-complexity pruned trees on Tweedie responses: The ABT machine for insurance ratemaking. Paper submitted for publication.
M. Denuit, J. Trufin and T. Verdebout (2022). Boosting on the responses with Tweedie loss functions. Paper submitted for publication.
BT.