Function to initialize the weights and biases in a neural network. It uses the Nguyen-Widrow (1990) algorithm.
Usage
initnw(neurons, p, n, npar)
Arguments
Value
A list containing initial values for weights and biases. The first $s$ components of the list contain vectors with the initial values for the weights and biases of each neuron; the $k$-th of these vectors is $(\omega_k, b_k, \beta_1^{(k)},...,\beta_p^{(k)})'$.
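For illustration, a hedged example of a call and of inspecting the returned list is sketched below. The argument values and their meanings (neurons as the number of neurons, p as the number of predictors, n and npar as additional sizes) are assumptions inferred from the Usage and Value sections, not stated on this page, and assume the package providing initnw has been loaded.

    # Hypothetical call; argument meanings and values are assumptions (see note above).
    init <- initnw(neurons = 3, p = 4, n = 10, npar = 19)
    # The first 'neurons' components should be vectors of the form
    # (omega_k, b_k, beta_1^(k), ..., beta_p^(k))'
    str(init)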
Details
The algorithm is described in Nguyen and Widrow (1990) and in several textbooks; see, for example, Sivanandam and Sumathi (2005). It is briefly summarized below.
1.- Compute the scaling factor $\theta=0.7 p^{1/n}$.
2.- Initialize the weights and biases of each neuron at random, for example by generating random numbers from $U(-0.5,0.5)$.
3.- For each neuron $k$: compute the norm of its weight vector, $\eta_k=\sqrt{\sum_{j=1}^p (\beta_j^{(k)})^2}$, and update $(\beta_1^{(k)},...,\beta_p^{(k)})'$ via
$$\beta_j^{(k)}=\frac{\theta \beta_j^{(k)}}{\eta_k}, \quad j=1,...,p,$$
then update the bias $b_k$ by generating a random number from $U(-\theta,\theta)$.
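The following R sketch illustrates the three steps above for a single hidden layer. It is not the package's internal implementation; the function name, argument names (s for the number of neurons, p for the number of predictors, n as in the scaling factor) and the returned structure are assumptions chosen to mirror the description under Value.

    # Sketch of the Nguyen-Widrow initialization described above (hypothetical helper).
    nguyen_widrow_init <- function(s, p, n) {
      theta <- 0.7 * p^(1 / n)                   # step 1: scaling factor
      lapply(seq_len(s), function(k) {
        omega <- runif(1, -0.5, 0.5)             # step 2: random initial values
        beta  <- runif(p, -0.5, 0.5)
        eta   <- sqrt(sum(beta^2))               # norm of the weight vector
        beta  <- theta * beta / eta              # step 3: rescale the weights
        b     <- runif(1, -theta, theta)         # step 3: bias drawn from U(-theta, theta)
        c(omega = omega, b = b, beta = beta)     # (omega_k, b_k, beta_1^(k), ..., beta_p^(k))'
      })
    }

    # Example: 3 neurons, 4 predictors, n = 10
    init <- nguyen_widrow_init(s = 3, p = 4, n = 10)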
References
Nguyen, D. and Widrow, B. 1990. "Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights",
Proceedings of the IJCNN, vol. 3, pp. 21-26.
Sivanandam, S.N. and Sumathi, S. 2005. Introduction to Neural Networks Using MATLAB 6.0. Ed. McGraw Hill, First edition.