FSDAM (version 2020.11-18)

fsdam: FS-DAM Nonlinear Dimension Reduction (NLDR)

Description

Forward stepwise deep autoencoder-based monotone nonlinear dimension reduction.

Usage

fsdam(dat, opt_numCode = ncol(dat), opt_seed = 1, opt_model = "n", opt_gpu = 0,
  opt_k = 100, opt_nEpochs = 10000,
  opt_constr = c("newpenalization", "constrained", "none"),
  opt_tuneParam = 10, opt_penfun = "mean", opt_ortho = 1, opt_earlystop = "no",
  verbose = FALSE)

# S3 method for fsdam
plot(x, which = c("mse", "history", "decoder.func", "scatterplot"), k = NULL,
  dim.1 = NULL, dim.2 = NULL, col.predict = 2, ...)

Arguments

dat

data frame containing the variables to reduce

opt_numCode

number of components to extract

opt_seed

random seed passed to torch

opt_model

model string; "n" stands for newpenalization

opt_gpu

zero-based index of the GPU to use, among all available GPUs; if negative, no GPU is used

opt_k

number of nodes in the coding/decoding layers

opt_nEpochs

number of epochs for training

opt_constr

type of constraint; one of "newpenalization", "constrained", or "none"

opt_tuneParam

tuning parameter for monotonicity penalty

opt_penfun

whether the penalty is applied to the "sum" or the "mean"

opt_ortho

tuning parameter for orthogonality penalty

opt_earlystop

whether to stop training early; defaults to "no"

verbose

whether to print progress messages during fitting

x

an fsdam object, as returned by fsdam

which

type of plot to make; one of "mse", "history", "decoder.func", or "scatterplot"

k

the component to plot

dim.1

index of the first variable

dim.2

index of the second variable

col.predict

color of the predicted curve when which = "scatterplot"

...

plotting arguments
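
For illustration, a minimal sketch of how the plot method arguments combine, assuming fit is a fitted fsdam object; the component and variable indices are arbitrary:

# mean squared error plot
plot(fit, which = "mse")

# training history
plot(fit, which = "history")

# decoder function for component 1
plot(fit, which = "decoder.func", k = 1)

# scatterplot of variables 1 and 2 for component 1,
# with the predicted curve in red (col.predict = 2 is the default)
plot(fit, which = "scatterplot", k = 1, dim.1 = 1, dim.2 = 2, col.predict = 2)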

Details

If the torch Python package is not available, this function stops with an error.

To make sure the right Python installation is used, run, for example, reticulate::use_python("/app/easybuild/software/Python/3.7.4-foss-2016b/bin/python") in R, with the path adjusted for your system, before calling this function for the first time.
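
A minimal setup sketch, assuming the reticulate package is installed; the Python path above is only an example and must match your system:

# point reticulate at the desired Python before torch is first loaded
reticulate::use_python("/app/easybuild/software/Python/3.7.4-foss-2016b/bin/python")

# check that the torch module can be found from R
reticulate::py_module_available("torch")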

References

Fong, Y. and Xu, J. Multi-Stage Simultaneous Deep Autoencoder-based Monotone (MSS-DAM) Nonlinear Dimensionality Reduction Methods. Journal of Computational and Graphical Statistics, in press.

Examples


if (FALSE) {

# fit a 2-component FS-DAM model to the first 100 rows of hvtn505tier1,
# dropping the first column
fit = fsdam(hvtn505tier1[1:100, -1], opt_numCode = 2, verbose = TRUE)

# print the fit and plot the mean squared errors and the training history
fit
plot(fit, which = "mse")
plot(fit, which = "history")

}
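
The defaults shown under Usage can also be spelled out; a hedged sketch, using only arguments documented above (opt_gpu = -1 disables GPU use):

if (FALSE) {

# same fit with several defaults written out explicitly, trained on CPU only
fit2 = fsdam(hvtn505tier1[1:100, -1], opt_numCode = 2,
             opt_constr = "newpenalization", opt_tuneParam = 10,
             opt_penfun = "mean", opt_ortho = 1, opt_gpu = -1,
             verbose = TRUE)
plot(fit2, which = "mse")

}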
