ggmlR (version 0.6.1)

ggml_leaky_relu: Leaky ReLU Activation (Graph)

Description

Creates a graph node for the Leaky ReLU activation, defined as LeakyReLU(x) = x if x > 0, else negative_slope * x. Unlike standard ReLU, which zeroes out negative inputs, Leaky ReLU passes a small nonzero gradient for negative values, which helps avoid "dead" units during training.
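The formula can be checked with a plain-R sketch; the `leaky_relu` helper below is a hypothetical reference implementation, independent of ggmlR's graph machinery:

```r
# Reference implementation of the Leaky ReLU formula (not part of ggmlR):
# returns x where x > 0, and negative_slope * x elsewhere, elementwise.
leaky_relu <- function(x, negative_slope = 0.01) {
  ifelse(x > 0, x, negative_slope * x)
}

leaky_relu(c(-2, -1, 0, 1, 2), negative_slope = 0.1)
# [1] -0.2 -0.1  0.0  1.0  2.0
```

The graph version below computes the same values, but as a deferred node that is only evaluated when the forward graph is run.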

Usage

ggml_leaky_relu(ctx, a, negative_slope = 0.01, inplace = FALSE)

Value

Tensor representing the Leaky ReLU operation

Arguments

ctx

GGML context

a

Input tensor

negative_slope

Slope for negative values (default: 0.01)

inplace

If TRUE, operation is performed in-place (default: FALSE)

Examples

# \donttest{
ctx <- ggml_init(16 * 1024 * 1024)              # allocate a 16 MiB context
a <- ggml_new_tensor_1d(ctx, GGML_TYPE_F32, 5)  # 1-D float32 tensor of length 5
ggml_set_f32(a, c(-2, -1, 0, 1, 2))
r <- ggml_leaky_relu(ctx, a, negative_slope = 0.1)
graph <- ggml_build_forward_expand(ctx, r)      # build the forward graph ending at r
ggml_graph_compute(ctx, graph)                  # evaluate the graph
result <- ggml_get_f32(r)                       # [-0.2, -0.1, 0, 1, 2]
ggml_free(ctx)                                  # release the context
# }