GELU (Gaussian Error Linear Unit) is a smoother version of the ReLU activation. Original paper: https://arxiv.org/abs/1606.08415
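For intuition, GELU weights its input by the standard normal CDF, gelu(x) = x * Phi(x); the tanh expression in the paper is a common approximation of this. The base-R sketch below only illustrates the definition and is not the package implementation (which operates on TensorFlow tensors); the helper names gelu_exact and gelu_approx are hypothetical.

# Illustrative base-R sketch of the GELU definition (not the package implementation)
# exact form: x * Phi(x), using the standard normal CDF pnorm()
gelu_exact  <- function(x) x * pnorm(x)
# tanh approximation from the paper
gelu_approx <- function(x) 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
gelu_exact(c(-2, 0, 2))   # approx -0.0455  0.0000  1.9545
gelu_approx(c(-2, 0, 2))  # close to the exact values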
Usage

gelu(x)

Arguments

x	Float Tensor to perform activation on.

Value

`x` with the GELU activation applied.
Examples

# NOT RUN
# create a float variable of shape (10) and apply the GELU activation
tfx <- tensorflow::tf$get_variable("none", tensorflow::shape(10L))
gelu(tfx)