RBERT (version 0.1.11)

gelu: Gaussian Error Linear Unit

Description

This is a smoother version of the ReLU activation. Original paper: https://arxiv.org/abs/1606.08415
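
Concretely, GELU(x) = x * Φ(x), where Φ is the standard normal cumulative distribution function. The plain-R sketch below (the names gelu_exact and gelu_tanh are illustrative and not part of RBERT) shows the exact definition alongside the tanh-based approximation used in the original BERT code; RBERT's gelu() operates on TensorFlow tensors rather than plain R vectors.

# Exact definition: x * Phi(x), with Phi the standard normal CDF
gelu_exact <- function(x) x * pnorm(x)

# Tanh-based approximation from the original BERT code
gelu_tanh <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

gelu_exact(c(-2, 0, 2))  # -0.04550026  0.00000000  1.95449974
gelu_tanh(c(-2, 0, 2))   # close to the exact values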

Usage

gelu(x)

Arguments

x

Float Tensor to perform activation on.

Value

`x` with the GELU activation applied.

Examples

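# Create a float variable of shape 10 (TensorFlow 1.x-style API) and apply GELU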
tfx <- tensorflow::tf$get_variable("none", tensorflow::shape(10L))
gelu(tfx)
