vcrpart (version 1.0-3)

tvcm-assessment: Model selection utility functions for tvcm objects.

Description

Pruning, cross-validation to find the optimal pruning parameter, and computation of validation set errors for tvcm objects.

Usage

# S3 method for tvcm
prune(tree, cp = NULL, alpha = NULL, maxstep = NULL,
      terminal = NULL, original = FALSE, ...)

# S3 method for tvcm
prunepath(tree, steps = 1L, ...)

# S3 method for tvcm
cvloss(object, folds = folds_control(), ...)

folds_control(type = c("kfold", "subsampling", "bootstrap"),
              K = ifelse(type == "kfold", 5, 100),
              prob = 0.5, weights = c("case", "freq"), seed = NULL)

# S3 method for cvloss.tvcm
plot(x, legend = TRUE, details = TRUE, ...)

# S3 method for tvcm
oobloss(object, newdata = NULL, weights = NULL, fun = NULL, ...)

Arguments

object, tree

an object of class tvcm.

cp

numeric scalar. The complexity parameter to be cross-validated, or the penalty with which the model should be pruned.

alpha

numeric significance level. Represents the stopping parameter for tvcm objects grown with sctest = TRUE, see tvcm_control. A node is split when the \(p\) value of any coefficient stability test in that node falls below alpha.

maxstep

integer. The maximum number of steps of the algorithm.

terminal

a list of integer vectors with the ids of the inner nodes to be set to terminal nodes. The length of the list must equal the number of partitions.

original

logical scalar. Whether pruning should be based on the trees from partitioning rather than on the current trees.

steps

integer vector. The iteration steps from which information should be extracted.

folds

a list with control arguments as produced by folds_control.

type

character string. The type of sampling scheme to be used to divide the data of the input model into a learning set and a validation set.

K

integer scalar. The number of folds.

weights

for folds_control, a character that defines whether the weights of object are case weights or frequencies of cases; for oobloss, a numeric vector of weights corresponding to the rows of newdata.

prob

numeric between 0 and 1. The probability for the "subsampling" cross-validation scheme.

seed

a numeric scalar that defines the seed.

x

an object of class cvloss.tvcm as produced by cvloss.

legend

logical scalar. Whether a legend should be added.

details

logical scalar. Whether the foldwise validation errors should be shown.

newdata

a data.frame of out-of-bag data (including the response variable). See also predict.tvcm.

fun

the loss function for the validation sets. By default, the (possibly weighted) mean of the deviance residuals as defined by the family of the fitted object is applied.

...

other arguments to be passed.

Value

prune returns a tvcm object; folds_control returns a list of parameters for building a cross-validation scheme; cvloss returns a cvloss.tvcm object with at least the following components:

grid

a list with values for cp.

oobloss

a matrix recording the validated loss for each value in grid for each fold.

cp.hat

numeric scalar. The tuning parameter which minimizes the cross-validated error.

folds

the folds used to extract the learning and the validation sets.

oobloss returns a scalar representing the total prediction error for newdata.
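For orientation, a minimal sketch of how these components might be inspected, assuming a fitted tvcm object 'model' as in the examples below:

cv <- cvloss(model, folds = folds_control("kfold", K = 5))
cv$grid     ## candidate values for 'cp'
cv$oobloss  ## validated loss per fold and 'cp' value
cv$cp.hat   ## 'cp' value minimizing the cross-validated error
cv$folds    ## fold assignments used for the learning and validation sets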

Details

tvcglm and tvcm perform tree-size selection by default. The functions described here may be of interest to advanced users.

The prune function is used to collapse inner nodes of the tree structures by the tuning parameter cp. The aim of pruning by cp is to collapse inner nodes to minimize the cost-complexity criterion

$$error(cp) = error(tree) + cp * complexity(tree)$$

where the training error \(error(tree)\) is defined by lossfun and \(complexity(tree)\) is defined as the total number of coefficients times dfpar plus the total number of splits times dfsplit. The function lossfun and the parameters dfpar and dfsplit are defined by the control argument of tvcm, see also tvcm_control. By default, \(error(tree)\) is minus two times the total log-likelihood of the model and \(complexity(tree)\) is the number of splits. The minimization of \(error(cp)\) is implemented by the following iterative backward-stepwise algorithm:

  1. fit all subtree models that collapse one inner node of the current tree model.

  2. compute the per-complexity increase in the training error $$dev = (error(subtree) - error(tree)) / (complexity(tree) - complexity(subtree))$$ for all fitted subtree models

  3. if any dev < cp, set the subtree that minimizes dev as the new tree model and repeat steps 1 to 3; otherwise stop.
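As a minimal usage sketch (assuming a fitted tvcm object 'model', as in the examples below), pruning with a fixed penalty simply passes the chosen value to prune; a larger 'cp' collapses more inner nodes:

model.p1 <- prune(model, cp = 0.1)  ## mild penalty, few inner nodes collapsed
model.p2 <- prune(model, cp = 10)   ## strong penalty, more inner nodes collapsed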

The penalty cp is generally unknown and is estimated adaptively from the data. The cvloss function implements the cross-validation method to do this. For each fold, cvloss repeats the following steps:

  1. fit a new model with tvcm based on the training data of the fold.

  2. prune the new model with increasing values of cp and compute, for each cp, the average validation error.

Doing so yields for each fold a sequence of values for cp and a corresponding sequence of average validation errors. These sequences are then combined on a finer grid and the validation errors are averaged correspondingly. From these two sequences, the cp value that minimizes the validation error is chosen. Notice that the average validation error is computed as the total prediction error of the validation set divided by the sum of validation set weights. See also the argument ooblossfun in tvcm_control and the function oobloss.
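The combination of the fold-wise sequences can be illustrated with plain R; this is only an illustrative sketch of the idea with made-up numbers, not the internal implementation of cvloss:

## fold-wise 'cp' sequences and average validation errors (hypothetical values)
cp1 <- c(0, 0.5, 1.0); err1 <- c(2.4, 2.1, 2.3)
cp2 <- c(0, 0.7, 1.2); err2 <- c(2.6, 2.0, 2.2)

## combine the sequences on a finer grid and average the errors
grid <- sort(unique(c(cp1, cp2)))
e1 <- approx(cp1, err1, xout = grid, method = "constant", rule = 2)$y
e2 <- approx(cp2, err2, xout = grid, method = "constant", rule = 2)$y
grid[which.min((e1 + e2) / 2)]  ## 'cp' value minimizing the averaged error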

The prunepath function can be used to backtrack the pruning algorithm. By default, it shows the results from collapsing inner nodes in the first iteration. The iteration(s) of interest can be selected with the steps argument. The output shows various information on the performance when collapsing inner nodes. The node labels shown in the output refer to the initial tree.
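A minimal usage sketch, assuming a pruned tvcm object 'model.p' as in the examples below:

prunepath(model.p, steps = 1:2)  ## collapsing information for iterations 1 and 2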

The function folds_control is used to specify the cross-validation scheme, where a random 5-fold cross-validation scheme is used by default. Alternatives are type = "subsampling" (random draws without replacement) and type = "bootstrap" (random draws with replacement). For 2-stage models (with random effects) fitted by olmm, the subsets are based on subject-wise (i.e., first-stage) sampling. For models where weights represent frequencies of observation units (e.g., data from contingency tables), the option weights = "freq" should be considered. cvloss returns an object for which print and plot generics are provided.
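For illustration, the three sampling schemes could be specified as follows (the values for 'K', 'prob' and 'seed' are arbitrary):

folds_control(type = "kfold", K = 5, seed = 1)           ## 5-fold cross-validation
folds_control(type = "subsampling", K = 50, prob = 0.5)  ## 50 random 50% subsamples
folds_control(type = "bootstrap", K = 50)                ## 50 bootstrap samples
folds_control(type = "kfold", K = 5, weights = "freq")   ## weights are frequencies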

oobloss can be used to estimate the total prediction error for validation data (the newdata argument). By default, the loss is defined as the sum of deviance residuals, see the return value dev.resids of family and family.olmm, respectively. Otherwise, the loss function can be defined manually via the argument fun, see the examples below. In general, the sum of deviance residuals equals the sum of -2 log-likelihood errors. A special case is the gaussian family, where the deviance residuals are computed as \(\sum_{i=1}^N w_i (y_i-\mu_i)^2\), that is, the deviance residuals ignore the term \(\log(2\pi\sigma^2)\). Therefore, the sum of deviance residuals for the gaussian model (and possibly others) is not exactly the sum of -2 log-likelihood prediction errors (but shifted by a constant). Another special case is models with random effects: for models based on olmm, the deviance residuals are retrieved from marginal predictions (where the random effects are integrated out).
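The gaussian special case mentioned above can be verified directly with the dev.resids function of the family object; the vectors 'y', 'mu' and 'wt' below are hypothetical:

y <- c(1.2, 0.4, 2.1); mu <- c(1.0, 0.5, 1.8); wt <- c(1, 2, 1)
sum(gaussian()$dev.resids(y, mu, wt))  ## sum of deviance residuals
sum(wt * (y - mu)^2)                   ## the same quantity computed by hand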

References

Breiman, L., J. H. Friedman, R. A. Olshen and C. J. Stone (1984). Classification and Regression Trees. New York, USA: Wadsworth.

Hastie, T., R. Tibshirani and J. Friedman (2001). The Elements of Statistical Learning (2 ed.). New York, USA: Springer-Verlag.

Buergin, R. and G. Ritschard (2017), Coefficient-Wise Tree-Based Varying Coefficient Regression with vcrpart. Journal of Statistical Software, 80(6), 1--33.

See Also

tvcm

Examples

# NOT RUN {
## --------------------------------------------------------- #
## Dummy Example:
##
## Model selection for the 'vcrpart_2' data. The example is
## merely a syntax template.
## --------------------------------------------------------- #

## load the data
data(vcrpart_2)

## fit the model
control <- tvcm_control(maxstep = 2L, minsize = 5L, cv = FALSE)
model <- tvcglm(y ~ vc(z1, z2, by = x1) + vc(z1, by = x2),
                data = vcrpart_2, family = gaussian(),
                control = control, subset = 1:75)

## cross-validate 'cp'
cv <- cvloss(model, folds = folds_control(type = "kfold", K = 2, seed = 1))
cv
plot(cv)

## prune model with estimated 'cp'
model.p <- prune(model, cp = cv$cp.hat)

## backtrack pruning
prunepath(model.p, steps = 1:3)

## out-of-bag error
oobloss(model, newdata = vcrpart_2[76:100,])

## use an alternative loss function
rfun <- function(y, mu, wt) sum(abs(y - mu))
oobloss(model, newdata = vcrpart_2[76:100,], fun = rfun)
# }
