
LearnSL (version 1.0.0)

act_method: Activation Function

Description

Calculates the output of the selected activation function for a given input value.

Usage

act_method(method, x)

Value

Output of the selected activation function evaluated at x.

Arguments

method

Activation function to be used. It must be one of "step", "sine", "tangent", "linear", "relu", "gelu" or "swish".

x

Input value to be used in the activation function.

Author

Víctor Amador Padilla, victor.amador@edu.uah.es

Details

Formulae used:

step

$$f(x) = \begin{cases} 0 & \text{if } x < \text{threshold} \\ 1 & \text{if } x \geq \text{threshold} \end{cases}$$

sine

$$f(x) = \sinh(x)$$

tangent

$$f(x) = \tanh(x)$$

linear

$$f(x) = x$$

relu

$$f(x) = \begin{cases} x & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}$$

gelu

$$f(x) = \frac{1}{2} \cdot x \cdot \left(1 + \tanh\left(\sqrt{\frac{2}{\pi}} \cdot (x + 0.044715 \cdot x^3)\right)\right)$$

swish

$$f(x) = \frac{x}{1 + e^{-x}}$$
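
The GELU and swish formulae above can be reproduced directly in base R. The sketch below is only a cross-check of the formulae, not the package's implementation; the helper names gelu_manual and swish_manual are hypothetical and not part of LearnSL.

# Hypothetical re-implementation of the gelu formula, for cross-checking only
gelu_manual <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

# Hypothetical re-implementation of the swish formula
swish_manual <- function(x) {
  x / (1 + exp(-x))
}

gelu_manual(0.7)   # should match act_method("gelu", 0.7)
swish_manual(0.7)  # should match act_method("swish", 0.7)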

Examples

# Step activation for input 0.3
act_method("step", 0.3)
# GELU activation for input 0.7
act_method("gelu", 0.7)
