ggmlR (version 0.6.1)

ggml_layer_batch_norm: Add Batch Normalization Layer

Description

Applies RMS normalization: the input is normalized by its root mean square, then scaled by a learnable gamma and shifted by a learnable beta. Built on ggml_rms_norm, which supports the backward pass needed for training.
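The per-element computation can be sketched in plain R. Note that rms_norm_ref is a hypothetical reference helper written here for illustration, not part of ggmlR; it assumes ggml_rms_norm divides by the root mean square of the values plus eps:

```r
# Reference sketch (assumption): y = x / sqrt(mean(x^2) + eps) * gamma + beta
rms_norm_ref <- function(x, gamma = 1, beta = 0, eps = 1e-5) {
  x / sqrt(mean(x^2) + eps) * gamma + beta
}

x <- c(1, 2, 3, 4)
# With gamma = 1 and beta = 0, the output has a root mean square close to 1
y <- rms_norm_ref(x)
sqrt(mean(y^2))
```

In the layer itself, gamma and beta are learnable parameters updated during training (when trainable = TRUE), rather than the fixed scalars used in this sketch.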

Usage

ggml_layer_batch_norm(model, eps = 1e-05, name = NULL, trainable = TRUE)

Value

The model object with the batch_norm layer appended (invisibly).

Arguments

model

A ggml_sequential_model object.

eps

Small constant added for numerical stability (default 1e-5).

name

Optional character name for the layer.

trainable

Logical; whether the layer weights are updated during training.

Examples

model <- ggml_model_sequential() |>
  ggml_layer_dense(128, input_shape = 784) |>
  ggml_layer_batch_norm() |>
  ggml_layer_dense(10, activation = "softmax")