One step of the backpropagation algorithm for a network with one hidden layer.

Usage:
  fitTeachNet1(data, weights, hidden.structure, learning.rate, f, f_d, decay, m_f, er)

Arguments:
  data              the data set
  weights           the current weights
  hidden.structure  the number of neurons in the hidden layer
  learning.rate     the rate by which the step factor for backpropagation gets smaller
  f                 the activation function
  f_d               the derivative of the activation function
  decay             the value of the weight decay
  m_f               interim value m
  er                the error function

Value:
  the new weights after the gradient update
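The gradient update this function performs can be sketched as follows. This is a minimal Python illustration, not the package's R code; the squared-error loss, the tanh-style activation pair, and the function and parameter names (`backprop_step`, `W1`, `W2`) are assumptions for the example.

```python
import numpy as np

def backprop_step(x, y, W1, W2, lr, f, f_d, decay):
    """One backpropagation step for a one-hidden-layer network.

    Hypothetical sketch: squared-error loss, linear output layer,
    weight decay added to both gradients.
    """
    # forward pass
    a1 = W1 @ x            # hidden pre-activation
    h = f(a1)              # hidden activation
    y_hat = W2 @ h         # linear output

    # backward pass (squared-error loss assumed)
    delta_out = y_hat - y
    grad_W2 = np.outer(delta_out, h) + decay * W2
    delta_hid = (W2.T @ delta_out) * f_d(a1)
    grad_W1 = np.outer(delta_hid, x) + decay * W1

    # gradient update: return the new weights
    return W1 - lr * grad_W1, W2 - lr * grad_W2
```

With a small learning rate and zero decay, one such step reduces the squared error on the training point, which is the behaviour the update is meant to have.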