Forward stepwise deep autoencoder-based monotone nonlinear dimension reduction.
Usage

fsdam(dat, opt_numCode = ncol(dat), opt_seed = 1, opt_model = "n", opt_gpu = 0,
  opt_k = 100, opt_nEpochs = 10000,
  opt_constr = c("newpenalization", "constrained", "none"),
  opt_tuneParam = 10, opt_penfun = "mean", opt_ortho = 1, opt_earlystop = "no",
  verbose = FALSE)

# S3 method for fsdam
plot(x, which = c("mse", "history", "decoder.func", "scatterplot"),
  k = NULL, dim.1 = NULL, dim.2 = NULL, col.predict = 2, ...)
Arguments

dat: data frame.
opt_numCode: number of components to extract.
opt_seed: seed for torch.
opt_model: "n" for newpenalization.
opt_gpu: zero-based index of the GPU to use among all GPUs. If negative, no GPU is used.
opt_k: number of nodes in the coding/decoding layers.
opt_nEpochs: number of epochs for training.
opt_constr: constraint string.
opt_tuneParam: tuning parameter for the monotonicity penalty.
opt_penfun: whether to penalize the sum or the mean.
opt_ortho: tuning parameter for the orthogonality penalty.
opt_earlystop: whether to stop training early.
verbose: whether to print progress messages.

x: an fsdam object.
which: the type of plot to produce: "mse", "history", "decoder.func", or "scatterplot".
k: the component to plot.
dim.1: index of the first variable.
dim.2: index of the second variable.
col.predict: color of the predicted curve when which = "scatterplot".
...: additional plotting arguments.
Details

This function requires the torch Python package and stops with an error if it is not available. To make sure the right Python installation is used, run reticulate::use_python("/app/easybuild/software/Python/3.7.4-foss-2016b/bin/python") in R before calling this function for the first time.
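Before the first call, it can help to confirm that reticulate can actually import torch from the selected Python. A minimal sketch using standard reticulate functions (`use_python`, `py_module_available`); the Python path above is site-specific, so the placeholder path here must be replaced with your own:

```r
library(reticulate)

# Point reticulate at the desired Python build before any Python code runs
# (placeholder path for illustration; substitute your installation)
# use_python("/path/to/python", required = TRUE)

# Verify the torch Python module can be imported before calling fsdam()
if (py_module_available("torch")) {
  message("torch found; fsdam() can be used")
} else {
  message("torch not found; install it in the selected Python environment")
}
```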
References

Fong, Y. and Xu, J. Multi-Stage Simultaneous Deep Autoencoder-based Monotone (MSS-DAM) Nonlinear Dimensionality Reduction Methods. Journal of Computational and Graphical Statistics, in press.
Examples

if (FALSE) {
fit = fsdam(hvtn505tier1[1:100, -1], opt_numCode = 2, verbose = TRUE)
fit
plot(fit, which = "mse")
plot(fit, which = "history")
}
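The other two plot types take additional arguments from the signature above. A hedged sketch, assuming the fitted object `fit` from the example and that the first component and first two input variables are of interest:

```r
if (FALSE) {
# assumes `fit` from the example above
# decoder function for component 1 (k selects the component)
plot(fit, which = "decoder.func", k = 1)
# scatterplot of two input variables with the predicted curve in red
plot(fit, which = "scatterplot", dim.1 = 1, dim.2 = 2, col.predict = 2)
}
```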