
ggmlR (version 0.6.1)

ggml_silu: SiLU Activation (Graph)

Description

Creates a graph node for the SiLU (Sigmoid Linear Unit) activation, also known as Swish, defined element-wise as SiLU(x) = x * sigmoid(x). This activation is used in the feed-forward layers of LLaMA-family models, so it is essential when building their computation graphs.
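As a reference for the values the graph node computes, SiLU can be sketched directly in base R, independent of ggmlR (the function name `silu` here is illustrative, not part of the package):

```r
# Reference implementation: SiLU(x) = x * sigmoid(x), applied element-wise
silu <- function(x) x * (1 / (1 + exp(-x)))

silu(c(-2, -1, 0, 1, 2))
# approximately: -0.2384 -0.2689 0.0000 0.7311 1.7616
```

Note that SiLU is smooth and non-monotonic: it dips slightly below zero for negative inputs before approaching zero, unlike ReLU.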

Usage

ggml_silu(ctx, a)

Value

A tensor node representing the element-wise SiLU of the input; its values are available after the graph has been computed

Arguments

ctx

GGML context

a

Input tensor

Examples

# \donttest{
ctx <- ggml_init(1024 * 1024)                    # allocate a context with a 1 MiB buffer
a <- ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 5)   # 1-D float32 tensor of length 5
ggml_set_f32(a, c(-2, -1, 0, 1, 2))              # fill with input values
result <- ggml_silu(ctx, a)                      # add a SiLU node to the graph
graph <- ggml_build_forward_expand(ctx, result)  # build the forward computation graph
ggml_graph_compute(ctx, graph)                   # evaluate the graph
ggml_get_f32(result)                             # read back the activated values
ggml_free(ctx)                                   # release the context
# }
