Function for comparing methods by reproducing the hold-out method (simulation) or \(k\)-fold cross-validation (application). See the vignette.
compare(
target,
source = NULL,
prior = NULL,
z = NULL,
family,
alpha,
scale = "iso",
sign = FALSE,
switch = FALSE,
select = TRUE,
foldid.ext = NULL,
nfolds.ext = 10,
foldid.int = NULL,
nfolds.int = 10,
type.measure = "deviance",
alpha.prior = NULL,
naive = TRUE,
seed = NULL,
cores = 1,
xrnet = FALSE
)

Arguments:

target: list with slot x (feature matrix with \(n\) rows and \(p\) columns) and slot y (target vector of length \(n\)); see the sketch below the argument list
source: list of \(k\) lists, each with slot x (feature matrix with \(m_i\) rows and \(p\) columns) and slot y (target vector of length \(m_i\))
prior: prior coefficients: matrix with \(p\) rows (features) and \(k\) columns (sources of co-data)
z: prior weights
family: character, "gaussian" (\(y\): real numbers), "binomial" (\(y\): 0s and 1s), or "poisson" (\(y\): non-negative integers)
alpha: elastic net mixing parameter (0 = ridge, 1 = lasso): number between 0 and 1
scale: character, "exp" for exponential calibration or "iso" for isotonic calibration
sign: sign discovery procedure: logical (experimental argument)
switch: choose between positive and negative weights for each source: logical
select: select from sources: logical
foldid.ext: external fold identifiers
nfolds.ext: number of external folds
foldid.int: internal fold identifiers
nfolds.int: number of internal folds
type.measure: character (default "deviance")
alpha.prior: elastic net mixing parameter (alpha) for the source regressions
naive: compare with naive transfer learning: logical
seed: random seed
cores: number of cores for parallel computing (requires R package `doMC`)
xrnet: compare with xrnet: logical
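
The sketch below shows how the documented input structures can be assembled from simulated data: `target` as a list with slots x and y, `source` as a list of such lists, and `prior` as a matrix with \(p\) rows and one column per source of co-data. All numeric choices (sample sizes, sparsity, noise levels) are illustrative assumptions, not defaults of the package.

## sketch: building the documented inputs (illustrative values)
set.seed(1)
n <- 100; p <- 200
x <- matrix(rnorm(n * p), nrow = n, ncol = p)
beta <- rnorm(p) * rbinom(p, size = 1, prob = 0.05)  # sparse "true" coefficients
target <- list(x = x, y = as.numeric(x %*% beta + rnorm(n)))

## external hold-out data sets: a list of lists with slots x and y
source <- lapply(c(50, 80), function(m) {
  xs <- matrix(rnorm(m * p), nrow = m, ncol = p)
  list(x = xs, y = as.numeric(xs %*% beta + rnorm(m)))
})

## prior coefficients: p rows (features), two columns (sources of co-data)
prior <- cbind(beta + rnorm(p, sd = 0.5), beta + rnorm(p, sd = 1))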
See also: [transreg()]
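
Given the objects from the sketch above, the calls below are a hedged illustration of the two modes named in the description, assuming that compare() is exported by the transreg package and that leaving source = NULL switches from the hold-out (simulation) mode to \(k\)-fold cross-validation (application) on the target data; the chosen argument values are assumptions for illustration only, see the vignette for details.

## library(transreg)  # assuming compare() is provided by this package

## simulation mode: external hold-out data sets supplied via `source`
sim <- compare(target = target, source = source, prior = prior,
               family = "gaussian", alpha = 1, seed = 1)

## application mode (assumption): source = NULL, so the target data are
## split into nfolds.ext external folds for cross-validation
app <- compare(target = target, prior = prior,
               family = "gaussian", alpha = 1,
               nfolds.ext = 5, seed = 1)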