
TSLA (version 0.1.2)

plot_TSLA: Plot aggregated structure

Description

Returns a tree plot of the aggregated structure selected by the given tuning-parameter indices.

Usage

plot_TSLA(TSLA.object, X_2, X_2.org, lambda.index, alpha.index)

Value

A plot of the aggregated tree structure.

Arguments

TSLA.object

A fitted model object returned by TSLA.fit(), or the TSLA.fit component of a cv.TSLA() output.

X_2

Expanded design matrix in matrix form.

X_2.org

Original design matrix in matrix form.

lambda.index

Index of the \(\lambda\) value selected.

alpha.index

Index of the \(\alpha\) value selected. Here \(\alpha\) is the tuning parameter for the generalized lasso penalty.
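
Both indices refer to positions in the tuning grids rather than to the tuning values themselves. A minimal illustrative sketch, assuming the \(\alpha\) grid follows the modstr settings used in the example below (the \(\lambda\) sequence itself is generated internally by the fitting routine):

alpha.grid <- seq(0, 1, length.out = 5)  # candidate alpha values from modstr
alpha.index <- 3                         # hypothetical selected index
alpha.grid[alpha.index]                  # alpha value used in the plot: 0.5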

Examples

# Load the synthetic data
data(ClassificationExample)

tree.org <- ClassificationExample$tree.org   # original tree structure
x2.org <- ClassificationExample$x.org      # original design matrix
x1 <- ClassificationExample$x1
y <- ClassificationExample$y            # response

# Do the tree-guided expansion
expand.data <- getetmat(tree.org, x2.org)
x2 <- expand.data$x.expand              # expanded design matrix
tree.expand <- expand.data$tree.expand  # expanded tree structure

# Do train-test split
idtrain <- 1:200
x1.train <- as.matrix(x1[idtrain, ])
x2.train <- x2[idtrain, ]
y.train <- y[idtrain, ]
x1.test <- as.matrix(x1[-idtrain, ])
x2.test <- x2[-idtrain, ]
y.test <- y[-idtrain, ]

# Specify algorithm and tuning parameters
set.seed(100)
control <- list(maxit = 100, mu = 1e-3, tol = 1e-5, verbose = FALSE)
modstr <- list(nlambda = 5, alpha = seq(0, 1, length.out = 5))

# Cross-validated TSLA fit with logistic loss and the CL2 penalty
simu.cv <- cv.TSLA(y = y.train, x1.train,
                   X_2 = x2.train,
                   treemat = tree.expand, family = 'logit',
                   penalty = 'CL2', pred.loss = 'AUC',
                   gamma.init = NULL, weight = c(1, 1), nfolds = 5,
                   group.weight = NULL, feature.weight = NULL,
                   control = control, modstr = modstr)

# Plot the aggregated tree structure at the CV-selected tuning parameters
plot_TSLA(simu.cv$TSLA.fit, x2, x2.org,
          simu.cv$lambda.min.index, simu.cv$alpha.min.index)
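
The indices passed to plot_TSLA() need not be the cross-validation minimizers; any position on the tuning grid can be inspected. An illustrative call, assuming the returned TSLA.fit object stores fits for the full grid (here 5 \(\lambda\) values and 5 \(\alpha\) values):

# Inspect the aggregated structure at a different grid point (illustrative;
# indices must lie within the grid defined by modstr)
plot_TSLA(simu.cv$TSLA.fit, x2, x2.org,
          lambda.index = 1, alpha.index = 5)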


