These are objects that can be used for modeling, especially in conjunction with the parsnip package.
mtry
mtry_long
trees
min_n
sample_size
learn_rate
loss_reduction
tree_depth
prune
Cp
Each object is generated by either new_quant_param() or new_qual_param().
An object of class quant_param (inherits from param) of length 7.
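A minimal sketch of inspecting one of these pre-made objects, assuming the dials package is attached (range_get() is dials' accessor for a parameter's range):

```r
library(dials)

# Each pre-made object carries a label, an optional transformation,
# and a default range; printing it shows all three
trees()

# The default range can also be retrieved programmatically
range_get(trees())
```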
These objects are pre-made parameter sets that are useful when the model is based on trees or rules.
mtry and mtry_long: The number of predictors that will be randomly sampled at each split when creating the tree models. The latter uses a log transformation and is helpful when the data set has a large number of columns. mtry is used by parsnip's parsnip::rand_forest() function.
trees: The number of trees contained in a random forest or boosted ensemble. In the latter case, this is equal to the number of boosting iterations. (See parsnip::rand_forest() and parsnip::boost_tree().)
min_n: The minimum number of data points in a node that are required for the node to be split further. (parsnip::rand_forest() and parsnip::boost_tree())
sample_size: The size of the data set used for modeling within an iteration of the modeling algorithm, such as stochastic gradient boosting. (parsnip::boost_tree())
learn_rate: The rate at which the boosting algorithm adapts from iteration to iteration. (parsnip::boost_tree())
loss_reduction: The reduction in the loss function required to split further. (parsnip::boost_tree())
tree_depth: The maximum depth of the tree (i.e., the number of splits). (parsnip::boost_tree())
prune: A logical for whether a tree or set of rules should be pruned.
Cp: The cost-complexity parameter in classical CART models.