ggmlR (version 0.6.1)

ggml_reglu: ReGLU (ReLU Gated Linear Unit) (Graph)

Description

Creates a graph node for the ReGLU operation. ReGLU splits the input in half along the first dimension, applies ReLU to the first half, and multiplies it elementwise by the second half (the gate).

Usage

ggml_reglu(ctx, a)

Value

A tensor whose first dimension is half that of the input; the remaining dimensions are unchanged

Arguments

ctx

GGML context

a

Input tensor (first dimension must be even)

Details

Formula: output = ReLU(x) * gate, where x is the first half of the input along the first dimension and gate is the second half.
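The split-and-gate semantics can be sketched in plain base R (an illustration of the formula only, not part of the ggmlR API; the helper name `reglu_ref` is hypothetical):

```r
# Reference sketch of ReGLU on a numeric vector of even length n:
# ReLU on the first n/2 elements, multiplied elementwise by the last n/2.
reglu_ref <- function(x) {
  n <- length(x)
  stopifnot(n %% 2 == 0)        # first dimension must be even
  a    <- x[1:(n / 2)]          # activation half
  gate <- x[(n / 2 + 1):n]      # gate half
  pmax(a, 0) * gate             # ReLU(a) * gate
}

reglu_ref(c(-1, 2, 3, 4))  # ReLU(c(-1, 2)) * c(3, 4) = c(0, 8)
```

For a 2-D tensor as in the example below, the same operation is applied column by column, which is why the output has half the rows of the input.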

Examples

# \donttest{
ctx <- ggml_init(16 * 1024 * 1024)                 # allocate a 16 MB context
a <- ggml_new_tensor_2d(ctx, GGML_TYPE_F32, 8, 3)  # first dimension (8) must be even
ggml_set_f32(a, rnorm(24))                         # fill with 8 * 3 = 24 values
r <- ggml_reglu(ctx, a)
graph <- ggml_build_forward_expand(ctx, r)
ggml_graph_compute(ctx, graph)
result <- ggml_get_f32(r)  # shape: 4 x 3 (first dimension halved)
ggml_free(ctx)
# }