Given an input value, calculates the output of the selected activation function.
act_method(method, x)
method: Activation function to be used. It must be one of the following:
"step"
"sine"
"tangent"
"linear"
"relu"
"gelu"
"swish"
x: Input value to be used in the activation function.
Author: Víctor Amador Padilla, victor.amador@edu.uah.es
Formulae used:
"step": $$f(x) = \begin{cases} 0 & \text{if } x < \text{threshold} \\ 1 & \text{if } x \geq \text{threshold} \end{cases}$$
"sine": $$f(x) = \sinh(x)$$
"tangent": $$f(x) = \tanh(x)$$
"linear": $$f(x) = x$$
"relu": $$f(x) = \begin{cases} x & \text{if } x > 0 \\ 0 & \text{if } x \leq 0 \end{cases}$$
"gelu": $$f(x) = \frac{1}{2} \cdot x \cdot \left(1 + \tanh\left(\sqrt{\frac{2}{\pi}} \cdot (x + 0.044715 \cdot x^3)\right)\right)$$
"swish": $$f(x) = \frac{x}{1 + e^{-x}}$$
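The formulae above can be sketched directly. The following is a minimal Python sketch of the same formulae, not the package's R implementation; the `threshold` default of 0 for "step" is an assumption, since the documented signature exposes only `method` and `x`:

```python
import math

def act_method(method, x, threshold=0.0):
    """Sketch of the activation formulae; `threshold` default is assumed."""
    if method == "step":
        return 1.0 if x >= threshold else 0.0
    if method == "sine":      # hyperbolic sine, per the formula above
        return math.sinh(x)
    if method == "tangent":   # hyperbolic tangent
        return math.tanh(x)
    if method == "linear":
        return x
    if method == "relu":
        return max(0.0, x)
    if method == "gelu":      # tanh approximation of GELU
        return 0.5 * x * (1.0 + math.tanh(
            math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
    if method == "swish":
        return x / (1.0 + math.exp(-x))
    raise ValueError(f"unknown method: {method}")
```

Each branch mirrors one formula above, in the same order as the list of accepted `method` values.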
# example code
act_method("step", 0.3)
act_method("gelu", 0.7)