Utility function to construct an optimizer from keras, primarily for internal use.
get_keras_optimizer(
optimizer = "adam",
lr = 0.001,
beta_1 = 0.9,
beta_2 = 0.999,
epsilon = 1e-07,
decay = NULL,
clipnorm = NULL,
clipvalue = NULL,
momentum = 0,
nesterov = FALSE,
rho = 0.95,
global_clipnorm = NULL,
use_ema = FALSE,
ema_momentum = 0.99,
ema_overwrite_frequency = NULL,
jit_compile = TRUE,
initial_accumulator_value = 0.1,
amsgrad = FALSE,
lr_power = -0.5,
l1_regularization_strength = 0,
l2_regularization_strength = 0,
l2_shrinkage_regularization_strength = 0,
beta = 0,
centered = FALSE
)

No return value.
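A minimal usage sketch, assuming this function is exported by the survivalmodels package and that a working keras/TensorFlow installation is available:

```r
library(survivalmodels)  # assumed package exporting get_keras_optimizer

# Construct the default Adam optimizer; all other arguments keep their defaults
opt <- get_keras_optimizer(optimizer = "adam", lr = 0.001)
```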
optimizer
(character(1))
Optimizer to construct, see details for those available.
Default is "adam".
lr
(numeric(1))
Learning rate passed to all optimizers.
beta_1, beta_2
(numeric(1))
Passed to adamax, adam, and nadam.
epsilon
(numeric(1))
Passed to adadelta, adagrad, adam, adamax, nadam, rmsprop.
decay, clipnorm, clipvalue
(numeric(1))
Passed to all optimizers.
momentum
(numeric(1))
Passed to rmsprop and sgd.
nesterov
(logical(1))
Passed to sgd.
rho
(numeric(1))
Passed to adadelta and rmsprop.
use_ema, jit_compile
(logical(1))
Passed to all optimizers.
global_clipnorm, ema_momentum, ema_overwrite_frequency
(numeric(1))
Passed to all optimizers.
initial_accumulator_value
(numeric(1))
Passed to adagrad and ftrl.
amsgrad
(logical(1))
Passed to adam and sgd.
lr_power, l1_regularization_strength, l2_regularization_strength, l2_shrinkage_regularization_strength, beta
(numeric(1))
Passed to ftrl.
centered
(logical(1))
Passed to rmsprop.
Implemented optimizers are:

"adadelta": keras::optimizer_adadelta
"adagrad": keras::optimizer_adagrad
"adam": keras::optimizer_adam
"adamax": keras::optimizer_adamax
"ftrl": keras::optimizer_ftrl
"nadam": keras::optimizer_nadam
"rmsprop": keras::optimizer_rmsprop
"sgd": keras::optimizer_sgd
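Since each string maps to the corresponding keras constructor, a call through this wrapper should build the same optimizer as calling keras directly. A hedged sketch (argument forwarding is an assumption based on the table above; requires keras and a Python backend):

```r
# Via the wrapper: momentum and nesterov are forwarded to sgd
opt1 <- get_keras_optimizer(optimizer = "sgd", lr = 0.01,
                            momentum = 0.9, nesterov = TRUE)

# Directly via keras; note keras uses `learning_rate` where the wrapper uses `lr`
opt2 <- keras::optimizer_sgd(learning_rate = 0.01,
                             momentum = 0.9, nesterov = TRUE)
```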