keras (version 2.1.3)

optimizer_adagrad: Adagrad optimizer.

Description

Adagrad optimizer as described in Adaptive Subgradient Methods for Online Learning and Stochastic Optimization (Duchi et al., 2011). Adagrad adapts the learning rate per parameter: parameters that receive large or frequent gradient updates get progressively smaller effective learning rates.
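The per-parameter adaptation described above can be sketched in plain R. This is an illustrative toy implementation of the Adagrad update rule, not the optimizer's internal code; the function name adagrad_step and its signature are assumptions for the example.

```r
# Illustrative single-step Adagrad update for a parameter vector `w`.
# `accum` holds the running sum of squared gradients per parameter.
adagrad_step <- function(w, grad, accum, lr = 0.01, epsilon = 1e-08) {
  accum <- accum + grad^2                       # accumulate squared gradients
  w <- w - lr * grad / (sqrt(accum) + epsilon)  # per-parameter scaled step
  list(w = w, accum = accum)
}
```

Because `accum` only grows, the effective step size for each parameter shrinks over training, which is the defining behavior of Adagrad.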

Usage

optimizer_adagrad(lr = 0.01, epsilon = 1e-08, decay = 0,
  clipnorm = NULL, clipvalue = NULL)

Arguments

lr

float >= 0. Learning rate.

epsilon

float >= 0. Fuzz factor; a small constant added to the denominator to avoid division by zero.

decay

float >= 0. Learning rate decay over each update.

clipnorm

Gradients will be clipped when their L2 norm exceeds this value.

clipvalue

Gradients will be clipped when their absolute value exceeds this value.

See Also

Other optimizers: optimizer_adadelta, optimizer_adamax, optimizer_adam, optimizer_nadam, optimizer_rmsprop, optimizer_sgd
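A minimal sketch of passing this optimizer to compile(). The model architecture, layer sizes, and metric below are illustrative assumptions, not part of this help page.

```r
library(keras)

# Hypothetical small classifier; shapes are illustrative assumptions.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = optimizer_adagrad(lr = 0.01, epsilon = 1e-08, decay = 0),
  metrics = "accuracy"
)
```

The arguments shown match the defaults from the Usage section; in practice it is common to leave Adagrad's parameters at their default values.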