dnn (version 0.0.6)

activation: Activation function

Description

Different types of activation functions and their corresponding derivatives

Usage

sigmoid(x)
elu(x)
relu(x)
lrelu(x)
idu(x)

dsigmoid(y)
delu(y)
drelu(y)
dlrelu(y)
dtanh(y)   # the activation function tanh(x) is already available in base R

Value

An activation function is applied to x and returns a matrix of the same size as x. The formula for each activation function is given below (a short numerical sketch follows the list):

sigmoid

return 1/(1+exp(-x))

elu

return x for x > 0 and exp(x) - 1 for x <= 0

relu

return x for x > 0 and 0 for x <= 0

lrelu

return x for x > 0 and 0.1*x for x <= 0

tanh

return tanh(x)

idu

return x (the identity function)
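
As a quick check, the piecewise formulas above can be evaluated directly in R. This is a minimal sketch, not part of the package documentation; the base-R expressions in the comments are equivalents shown for comparison:

  library(dnn)
  x = c(-2, -0.5, 0, 0.5, 2)
  sigmoid(x)     # 1/(1 + exp(-x))
  relu(x)        # pmax(x, 0)
  lrelu(x)       # ifelse(x > 0, x, 0.1*x)
  elu(x)         # ifelse(x > 0, x, exp(x) - 1)
  idu(x)         # x, the identity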

Arguments

x

input of the activation function

y

input of the derivative of the activation function; by convention this is the output of the corresponding activation function, y = f(x), rather than the original input x

Author

Bingshu E. Chen

Details

Each function returns either the value of the activation function (e.g. sigmoid, relu) or its derivative (e.g. dsigmoid, drelu). Note that the derivative functions take the activation output y = f(x) as their argument rather than the original input x, which is convenient for back-propagation, where the activation outputs are already cached.
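
This convention can be verified numerically. The following is a minimal sketch (assuming the dnn package is attached and that dsigmoid follows the y-input convention described above): the derivative evaluated at y = sigmoid(x) should match a finite-difference slope of sigmoid at x.

  library(dnn)
  x = 0.5
  y = sigmoid(x)                                     # forward-pass output
  h = 1e-6
  slope = (sigmoid(x + h) - sigmoid(x - h)) / (2*h)  # numerical derivative at x
  all.equal(dsigmoid(y), slope, tolerance = 1e-4)    # should be TRUE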

See Also

bwdNN, fwdNN, dNNmodel, optimizerSGD, optimizerNAG

Examples

  # Specify a dnn model with a user-defined activation function in layer 2.
  # Despite the name, this 'softmax' is the softplus function log(1 + exp(x)).
  softmax  = function(x) {log(1 + exp(x))}   # y = log(1+exp(x))
  dsoftmax = function(y) {1 - exp(-y)}       # dy/dx = sigmoid(x) = 1 - exp(-y)
  model = dNNmodel(units = c(8, 6, 1), activation = c('relu', 'softmax', 'sigmoid'),
          input_shape = c(3))
  print(model)
  print(model)
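  # A hedged sketch (not from the original help page): run a forward pass with
  # fwdNN to see the user-defined activation in action; fwdNN(X, model) is
  # assumed here to accept an n x 3 input matrix X (matching input_shape = 3).
  X = matrix(runif(30), nrow = 10, ncol = 3)
  cache = fwdNN(X, model)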
