Filter response normalization layer.
layer_filter_response_normalization(
  object,
  epsilon = 1e-06,
  axis = c(1, 2),
  beta_initializer = "zeros",
  gamma_initializer = "ones",
  beta_regularizer = NULL,
  gamma_regularizer = NULL,
  beta_constraint = NULL,
  gamma_constraint = NULL,
  learned_epsilon = FALSE,
  learned_epsilon_constraint = NULL,
  name = NULL
)
object
Model or layer object.

epsilon
Small positive float value added to the variance to avoid dividing by zero.

axis
List of axes that should be normalized. This should represent the spatial dimensions.

beta_initializer
Initializer for the beta weight.

gamma_initializer
Initializer for the gamma weight.

beta_regularizer
Optional regularizer for the beta weight.

gamma_regularizer
Optional regularizer for the gamma weight.

beta_constraint
Optional constraint for the beta weight.

gamma_constraint
Optional constraint for the gamma weight.

learned_epsilon
(bool) Whether to add another learnable epsilon parameter or not.

learned_epsilon_constraint
Optional constraint for the learned epsilon weight.

name
Optional name for the layer.
A tensor
Filter Response Normalization (FRN) is a normalization method that enables models trained with per-channel normalization to achieve high accuracy. It performs better than all other normalization techniques for small batch sizes and is on par with Batch Normalization for larger batch sizes.
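For each sample and each channel, the layer normalizes the activations by the root mean square taken over the spatial axes given in `axis`, then applies the learned `gamma` and `beta` weights. Below is a minimal sketch of the layer used inside a small convolutional model, assuming the layer is provided by the tfaddons package and used with the keras R interface; the architecture and parameter choices are illustrative only.

```r
library(keras)
library(tfaddons)  # assumed to provide layer_filter_response_normalization()

# FRN computes, per sample and per channel:
#   nu2 = mean(x^2) over the spatial axes
#   output = gamma * x / sqrt(nu2 + epsilon) + beta
model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 16, kernel_size = c(3, 3),
                input_shape = c(32, 32, 3)) %>%
  # normalize each filter response over its spatial dimensions
  layer_filter_response_normalization(axis = c(1, 2)) %>%
  layer_activation(activation = "relu") %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3)) %>%
  # optionally learn the epsilon offset instead of keeping it fixed
  layer_filter_response_normalization(learned_epsilon = TRUE) %>%
  layer_activation(activation = "relu") %>%
  layer_global_average_pooling_2d() %>%
  layer_dense(units = 10, activation = "softmax")

summary(model)
```

Note that the FRN paper pairs this normalization with a thresholded linear unit (TLU) activation; plain ReLU is used here only to keep the sketch self-contained.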