
sparselink (version 1.0.0)

methods: Available methods

Description

Wrapper functions for available methods for related problems (multi-task learning and transfer learning).

Usage

wrap_empty(x, y, family, alpha = 1)

wrap_separate(x, y, family, alpha = 1, lambda = NULL)

# S3 method for wrap_separate
predict(object, newx, ...)

# S3 method for wrap_separate
coef(object, ...)

wrap_common(x, y, family, alpha = 1)

# S3 method for wrap_common
predict(object, newx, ...)

# S3 method for wrap_common
coef(object, ...)

wrap_mgaussian(x, y, family = "gaussian", alpha = 1)

# S3 method for wrap_mgaussian
predict(object, newx, ...)

# S3 method for wrap_mgaussian
coef(object, ...)

wrap_spls(x, y, family = "gaussian", alpha = 1, nfolds = 10)

# S3 method for wrap_spls
predict(object, newx, ...)

# S3 method for wrap_spls
coef(object, ...)

wrap_glmtrans(x, y, family = "gaussian", alpha = 1)

# S3 method for wrap_glmtrans
predict(object, newx, ...)

# S3 method for wrap_glmtrans
coef(object, ...)

wrap_xrnet(x, y, alpha.init = 0.95, alpha = 1, nfolds = 10, family = "gaussian")

# S3 method for wrap_xrnet
predict(object, newx, ...)

# S3 method for wrap_xrnet
coef(object, ...)

Value

The wrapper functions wrap_empty, wrap_separate, wrap_common, wrap_mgaussian, wrap_spls, wrap_glmtrans, and wrap_xrnet return fitted models. The generic functions coef and predict return coefficients and predicted values, respectively, in a standardised format.
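
For illustration, a minimal sketch that fits one wrapper on simulated multi-task data and inspects the standardised outputs (the exact container types are determined by the package):

x <- matrix(stats::rnorm(100*50),nrow=100,ncol=50)
y <- matrix(stats::rnorm(100*3),nrow=100,ncol=3)
newx <- matrix(stats::rnorm(10*50),nrow=10,ncol=50)
object <- wrap_separate(x=x,y=y,family="gaussian")
str(coef(object))               # standardised coefficients
str(predict(object,newx=newx))  # standardised predictions for the test samples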

Arguments

x

feature matrix (multi-task learning) or list of \(q\) feature matrices (transfer learning)

y

response matrix (multi-task learning) or list of \(q\) response vectors (transfer learning)

family

character vector with 1 or \(q\) entries; possible values are "gaussian" and, for some methods, "binomial" or other families

alpha

elastic net mixing parameter: number between 0 and 1 (see the penalty sketch after this list)

lambda

sequence of regularisation parameters

object

output from multi-task learning or transfer learning method

newx

feature matrix (MTL) or list of feature matrices (TL) of testing samples

...

(not applicable)

nfolds

number of cross-validation folds: positive integer

alpha.init

elastic net mixing parameter for initial models: number between 0 and 1
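
The arguments alpha and alpha.init control the mix between the ridge and lasso penalties in the underlying elastic net fits. As a minimal sketch, assuming the usual glmnet-style parameterisation of the backends (not stated on this page), the penalty is

\(\lambda \left[ \frac{1 - \alpha}{2} \|\beta\|_2^2 + \alpha \|\beta\|_1 \right],\)

so \(\alpha = 1\) yields the lasso penalty and \(\alpha = 0\) the ridge penalty.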

Functions

  • wrap_empty(): intercept-only model (MTL and TL)

  • wrap_separate(): separate model for each problem (MTL and TL)

  • wrap_common(): common model for all problems (TL)

  • wrap_mgaussian(): multivariate Gaussian regression (MTL)

  • wrap_spls(): sparse partial least squares (MTL)

  • wrap_glmtrans(): transfer generalised linear model (TL)

  • wrap_xrnet(): hierarchical regression (TL)
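
The labels above map onto the two input layouts from the Arguments section. A hypothetical helper (not part of the package) makes the correspondence explicit:

# hypothetical helper: which wrappers apply to a given response layout?
applicable_wrappers <- function(y) {
  if (is.matrix(y)) {
    # multi-task learning: response matrix
    c("wrap_empty","wrap_separate","wrap_mgaussian","wrap_spls")
  } else if (is.list(y)) {
    # transfer learning: list of response vectors
    c("wrap_empty","wrap_separate","wrap_common","wrap_glmtrans","wrap_xrnet")
  } else {
    stop("y must be a response matrix (MTL) or a list of response vectors (TL)")
  }
}

For example, with the multi-task response matrix from the Examples below, only the first four wrappers would apply.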

References

Noah Simon, Jerome H. Friedman, and Trevor Hastie (2013). "A Blockwise Descent Algorithm for Group-penalized Multiresponse and Multinomial Regression". arXiv (preprint). doi:10.48550/arXiv.1311.6529. (cv.glmnet)

Hyonho Chun and Sündüz Keleş (2010). "Sparse Partial Least Squares Regression for Simultaneous Dimension Reduction and Variable Selection". Journal of the Royal Statistical Society Series B: Statistical Methodology 72(1):3–25. doi:10.1111/j.1467-9868.2009.00723.x. (spls)

Ye Tian and Yang Feng (2022). "Transfer learning under high-dimensional generalized linear models". Journal of the American Statistical Association 118(544):2684–2697. doi:10.1080/01621459.2022.2071278. (glmtrans)

Garrett M. Weaver and Juan Pablo Lewinger (2019). "xrnet: Hierarchical Regularized Regression to Incorporate External Data". Journal of Open Source Software 4(44):1761. doi:10.21105/joss.01761. (xrnet)

See Also

See the original functions cv.glmnet (with argument family="mgaussian"), spls, glmtrans, and xrnet.

Examples

#--- multi-task learning ---
# simulate training and test data for q = 3 related tasks
n_train <- 100
n_test <- 10
p <- 50
q <- 3
family <- "gaussian"
x <- matrix(data=stats::rnorm(n=n_train*p),nrow=n_train,ncol=p)
newx <- matrix(data=stats::rnorm(n=n_test*p),nrow=n_test,ncol=p)
y <- matrix(data=stats::rnorm(n_train*q),nrow=n_train,ncol=q)
# fit the selected wrapper
model <- "empty" # try "empty", "separate", "mgaussian" or "spls"
if(model=="empty"){
  object <- wrap_empty(x=x,y=y,family=family)
} else if(model=="separate"){
  object <- wrap_separate(x=x,y=y,family=family)
} else if(model=="mgaussian"){
  object <- wrap_mgaussian(x=x,y=y,family=family)
} else if(model=="spls"){
  object <- wrap_spls(x=x,y=y,family=family)
}
# extract coefficients and predict on the test samples
coef(object)
predict(object,newx=newx)

#--- transfer learning ---
# simulate training and test data for q = 2 related datasets
n_train <- c(100,50)
n_test <- c(10,10)
p <- 50
x <- lapply(X=n_train,function(n) matrix(data=stats::rnorm(n*p),nrow=n,ncol=p))
newx <- lapply(X=n_test,function(n) matrix(data=stats::rnorm(n*p),nrow=n,ncol=p))
y <- lapply(X=n_train,function(n) stats::rnorm(n))
family <- "gaussian"
# fit the selected wrapper
model <- "empty" # try "empty", "separate", "common", "glmtrans", or "xrnet"
if(model=="empty"){
  object <- wrap_empty(x=x,y=y,family=family)
} else if(model=="separate"){
  object <- wrap_separate(x=x,y=y,family=family)
} else if(model=="common"){
  object <- wrap_common(x=x,y=y,family=family)
} else if(model=="glmtrans"){
  object <- wrap_glmtrans(x=x,y=y,family=family)
} else if(model=="xrnet"){
  object <- wrap_xrnet(x=x,y=y,family=family)
}
# extract coefficients and predict on the test samples
coef(object)
predict(object,newx=newx)
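
# To gauge out-of-sample error in the transfer learning setting, one could compare the
# predictions with simulated test responses. A minimal sketch, assuming that predict()
# returns one prediction vector per dataset (newy is simulated only for illustration):
newy <- lapply(X=n_test,function(n) stats::rnorm(n)) # simulated test responses
pred <- predict(object,newx=newx)
if(is.list(pred)){
  mapply(function(obs,fit) mean((obs-fit)^2),newy,pred) # mean squared error per dataset
}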
