ggmlR (version 0.6.1)

ggml_geglu: GeGLU (GELU Gated Linear Unit) (Graph)

Description

Creates a graph node for the GeGLU operation. GeGLU splits the input in half along its first dimension, applies GELU to the first half, and multiplies the result element-wise by the second half (the gate). Used as the feed-forward activation in models such as GPT-NeoX and Falcon.

Usage

ggml_geglu(ctx, a)

Value

A tensor whose first dimension is half that of the input; the remaining dimensions are unchanged.

Arguments

ctx

GGML context

a

Input tensor (first dimension must be even)

Details

Formula: the input is split along the first dimension into `x` (first half) and `gate` (second half), and output = GELU(x) * gate, computed element-wise.
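
To make the formula concrete, here is a plain-R sketch of the same computation. It is an illustration, not the package's implementation; the `gelu` and `geglu_ref` helpers are hypothetical names, and the GELU tanh approximation is an assumption about which GELU variant ggml uses.

```r
# Hypothetical reference implementation of GeGLU in plain R.
# Assumes the tanh approximation of GELU (this may differ from ggml's exact variant).
gelu <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

geglu_ref <- function(m) {
  stopifnot(nrow(m) %% 2 == 0)                    # first dimension must be even
  half <- nrow(m) / 2
  x    <- m[seq_len(half), , drop = FALSE]        # first half: activation input
  gate <- m[half + seq_len(half), , drop = FALSE] # second half: gate
  gelu(x) * gate                                  # element-wise product
}

m <- matrix(rnorm(24), nrow = 8, ncol = 3)
dim(geglu_ref(m))  # 4 3 -- first dimension halved
```

The output shape matches what `ggml_geglu` returns for an 8x3 input: 4x3.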

Examples

# \donttest{
ctx <- ggml_init(16 * 1024 * 1024)                 # 16 MiB context
a <- ggml_new_tensor_2d(ctx, GGML_TYPE_F32, 8, 3)  # first dimension (8) is even
ggml_set_f32(a, rnorm(24))                         # fill with 8 * 3 = 24 values
r <- ggml_geglu(ctx, a)
graph <- ggml_build_forward_expand(ctx, r)
ggml_graph_compute(ctx, graph)
result <- ggml_get_f32(r)  # shape: 4 x 3 (first dimension halved)
ggml_free(ctx)
# }
