Calculates the initial NN parameter values such that the activation points lie within the range between min_init and max_init defined in the un-converted Monolix model file.
nn_theta_initializer_mlx(
number,
xmini,
xmaxi,
n_hidden = 5,
theta_scale = 0.1,
pre_fixef = NULL,
time_nn = FALSE,
act = "ReLU",
beta = 20
)

Returns a vector of initial NN parameter values for one specific NN.
number: (string) Name of the NN, e.g., "1" for NN1(...)
xmini: (numeric) Minimal activation point
xmaxi: (numeric) Maximal activation point
n_hidden: (numeric) Number of neurons in the hidden layer; default is 5
theta_scale: (numeric) Scale for the input-to-hidden weight initialization; default is 0.1
pre_fixef: (named vector) Vector of pre-defined initial values
time_nn: (boolean) Whether the NN is time-dependent (TRUE) or not (FALSE); default is FALSE
act: (string) Activation function used in the NN; currently "ReLU" and "Softplus" are available
beta: (numeric) Beta value for the Softplus activation function; only applicable if act = "Softplus"; default is 20
Dominic Bräm
theta_scale is the scale on which the weights from the input to the hidden layer are initialized, e.g., theta_scale = 0.1 initializes weights between -0.3 and 0.3, while theta_scale = 0.01 initializes weights between -0.03 and 0.03 (see the sketch below).
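For illustration only (this is not the package's internal code): the documented examples imply an initialization range of roughly +/- 3 * theta_scale, assuming uniform draws within that range.

## Illustrative sketch: initialization range implied by theta_scale,
## based on the examples above (0.1 -> [-0.3, 0.3]; 0.01 -> [-0.03, 0.03]).
init_range <- function(theta_scale) {
  c(lower = -3 * theta_scale, upper = 3 * theta_scale)
}
init_range(0.1)   # lower: -0.3, upper: 0.3
init_range(0.01)  # lower: -0.03, upper: 0.03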
If time_nn = TRUE, the NN is treated as a time-dependent NN with the restriction that all weights from the input to the hidden layer are negative.
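A minimal usage example; the argument values below are hypothetical and only illustrate the call, they are not taken from a real Monolix model file:

## Initialize parameters for NN1 with activation points between 0 and 24
## (e.g., hours), 5 hidden neurons, ReLU activation, and the time-dependent
## restriction on the input-to-hidden weights.
theta_init <- nn_theta_initializer_mlx(
  number = "1",
  xmini = 0,
  xmaxi = 24,
  n_hidden = 5,
  theta_scale = 0.1,
  time_nn = TRUE,
  act = "ReLU"
)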