Create setting for CIReNN model
setCIReNN(numberOfRNNLayer = c(1), units = c(128, 64),
recurrentDropout = c(0.2), layerDropout = c(0.2), lr = c(1e-04),
decay = c(1e-05), outcomeWeight = c(1), batchSize = c(100),
epochs = c(100), earlyStoppingMinDelta = c(1e-04),
earlyStoppingPatience = c(10), useVae = T,
vaeDataSamplingProportion = 0.1, vaeValidationSplit = 0.2,
vaeBatchSize = 100L, vaeLatentDim = 10L, vaeIntermediateDim = 256L,
vaeEpoch = 100L, vaeEpislonStd = 1, seed = NULL)
numberOfRNNLayer: The number of RNN layers; only 1, 2, or 3 layers are available now, e.g. 1, c(1, 2), c(1, 2, 3)
units: The number of units of the RNN layer, as a list of vectors
recurrentDropout: The recurrent dropout rate (regularisation)
layerDropout: The layer dropout rate (regularisation)
lr: Learning rate
decay: Learning rate decay over each update
outcomeWeight: The weight of the outcome class in the loss function
batchSize: The number of data points to use per training batch
epochs: Number of times to iterate over the dataset
earlyStoppingMinDelta: Minimum change in the monitored quantity to qualify as an improvement for early stopping, i.e. an absolute change of less than min_delta in the validation loss counts as no improvement
earlyStoppingPatience: Number of epochs with no improvement after which training will be stopped
useVae: Logical (either TRUE or FALSE) value for using a variational autoencoder (VAE) before the RNN
vaeDataSamplingProportion: Data sampling proportion for the VAE
vaeValidationSplit: Validation split proportion for the VAE
vaeBatchSize: Batch size for the VAE
vaeLatentDim: Number of latent dimensions for the VAE
vaeIntermediateDim: Number of intermediate dimensions for the VAE
vaeEpoch: Number of times to iterate over the dataset for the VAE
vaeEpislonStd: Epsilon standard deviation for the VAE
seed: Random seed used by the deep learning model
# NOT RUN {
model.CIReNN <- setCIReNN()
# }
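As a hedged illustration beyond the default call above, the sketch below shows how candidate values for the documented arguments might be supplied; the argument names come from the usage section, while the specific values are illustrative assumptions rather than recommended settings.

# Illustrative only: the candidate hyperparameter values are assumptions, not package defaults.
model.CIReNN <- setCIReNN(
  numberOfRNNLayer = c(1, 2),        # search over one and two RNN layers
  units = c(128, 64),                # candidate unit counts
  recurrentDropout = c(0.2),         # recurrent dropout rate (regularisation)
  layerDropout = c(0.2),             # layer dropout rate (regularisation)
  lr = c(1e-04, 1e-03),              # candidate learning rates
  batchSize = c(100),                # data points per training batch
  epochs = c(100),                   # passes over the dataset
  useVae = TRUE,                     # apply a variational autoencoder before the RNN
  vaeLatentDim = 10L,                # VAE latent dimensions
  seed = 42                          # fix the random seed for reproducibility
)

The returned object is only a model-settings specification; fitting the model happens elsewhere in the surrounding prediction framework and is not shown on this page.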