Adadelta optimizer, as described in ADADELTA: An Adaptive Learning Rate Method (Zeiler, 2012).
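In brief, Adadelta scales each step by the ratio of an exponentially decaying RMS of past updates to that of past gradients, so no fixed learning rate needs to be hand-tuned. A sketch of the update rule from the paper follows (rho is the decay rate and epsilon the fuzz factor from the arguments below; the final lr multiplier is this implementation's addition, and the default lr = 1 recovers the paper's rule):

$E[g^2]_t = \rho\,E[g^2]_{t-1} + (1-\rho)\,g_t^2$
$\Delta x_t = -\dfrac{\sqrt{E[\Delta x^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}}\,g_t$
$E[\Delta x^2]_t = \rho\,E[\Delta x^2]_{t-1} + (1-\rho)\,\Delta x_t^2$
$x_{t+1} = x_t + \mathrm{lr}\cdot\Delta x_t$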
optimizer_adadelta(
  lr = 1,
  rho = 0.95,
  epsilon = NULL,
  decay = 0,
  clipnorm = NULL,
  clipvalue = NULL
)

Arguments:

lr: float >= 0. Learning rate.
rho: float >= 0. Decay factor.
epsilon: float >= 0. Fuzz factor. If NULL, defaults to k_epsilon().
decay: float >= 0. Learning rate decay over each update.
clipnorm: Gradients will be clipped when their L2 norm exceeds this value.
clipvalue: Gradients will be clipped when their absolute value exceeds this value.
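As a minimal usage sketch, the optimizer is typically passed to compile(); the two-layer model here is purely illustrative, and clipnorm = 1 is included only to demonstrate optional gradient clipping:

library(keras)

# Illustrative classifier; the architecture is an assumption, not prescribed here.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  # Defaults shown explicitly; clipping is optional and off by default (NULL).
  optimizer = optimizer_adadelta(lr = 1, rho = 0.95, clipnorm = 1),
  metrics = "accuracy"
)

It is recommended to leave the parameters of this optimizer at their default values.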
Other optimizers:
optimizer_adagrad(),
optimizer_adam(),
optimizer_adamax(),
optimizer_nadam(),
optimizer_rmsprop(),
optimizer_sgd()