One step of the backpropagation algorithm for a network with two hidden layers.
fitTeachNet2(data, weights, hidden.structure, learning.rate, f, f_d, decay, m_f, er)
data: the data set
weights: the current weights
hidden.structure: vector whose first element is the number of hidden neurons in the first hidden layer and whose second element is the number in the second hidden layer
learning.rate: rate by which the factor for backpropagation gets smaller
f: activation function
f_d: derivative of the activation function
decay: value of the weight decay
m_f: interim value m
er: error function
returns the new weights after the gradient update
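The update performed here is the usual gradient-descent step with weight decay: each weight moves against the gradient of the error, with an additional penalty term proportional to the weight itself. The lines below are a minimal sketch of that step for a single weight matrix, assuming a hypothetical helper name (update_step) and dummy gradients; they are illustrative only and do not reproduce the package's internal code.

## Minimal sketch of one gradient-descent step with weight decay.
## update_step, W and grad_W are illustrative names, not the package's API.
update_step <- function(W, grad_W, learning.rate, decay) {
  ## gradient of the penalized error: dE/dW plus the weight-decay term
  penalized_grad <- grad_W + decay * W
  ## move against the gradient, scaled by the learning rate
  W - learning.rate * penalized_grad
}

## Example: a 3-by-2 weight matrix updated with a dummy gradient
W      <- matrix(rnorm(6), nrow = 3, ncol = 2)
grad_W <- matrix(rnorm(6), nrow = 3, ncol = 2)
W_new  <- update_step(W, grad_W, learning.rate = 0.1, decay = 0.01)

In fitTeachNet2 this kind of update is applied to every weight in the two-hidden-layer network, using the gradients obtained by backpropagating the error function er through f and f_d.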