Description:

Optimize a one-layer Hopfield artificial neural network. The structure
of the network is simple: a Hopfield network with N input neurons all
connected to C output neurons. The values of N and C are determined by
the input data: N is the number of columns of xi (which is also the
length of sigma), and C is the number of unique values in classes.
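For illustration, N and C can be derived from the data passed to hann1() as follows. This is a minimal sketch with made-up data; only the documented behaviour above (N = number of columns of xi, C = number of unique classes) is assumed:

```r
## Hypothetical data, only to illustrate how N and C are derived:
xi <- matrix(rnorm(20), nrow = 5)       # K = 5 patterns, 4 columns
classes <- c("a", "b", "a", "c", "b")   # one class per pattern

N <- ncol(xi)                 # number of input neurons
C <- length(unique(classes))  # number of output neurons
N  # 4
C  # 3
```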
Usage:

hann1(xi, sigma, classes, labels = NULL,
      net = NULL, control = control.hann())

## S3 method for class 'hann1':
print(x, details = FALSE, ...)
Value:

An object of class c("hann", "hann1") with the following elements:

- a list with one matrix, W, and one vector, bias;
- the Hopfield network;
- the hyperparameter of the activation function;
- the labels of the classes;
- the function call;
- the raw signals of the output neurons from the input patterns.
Arguments:

xi: a matrix of patterns with K rows.

sigma: a vector coding the Hopfield network.

classes: the classes of the patterns (a vector of length K).

labels: a vector of labels used for the classes.

net: an object inheriting class "hann1".

control: the control parameters.

details: a logical value (whether to print the parameter values of the network).

...: further arguments passed to print.default.
Details:

By default, the parameters of the neural network are initialized with random values drawn from a uniform distribution between -1 and 1, except the biases, which are initialized to zero.
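The default initialization just described can be sketched as follows. This is illustrative, not the package's internal code: the values of N and C, and the orientation of the weight matrix W (C rows by N columns), are assumptions.

```r
## Sketch of the default initialization: weights drawn uniformly
## in [-1, 1], biases set to zero.
N <- 4  # number of input neurons (columns of xi)
C <- 3  # number of output neurons (unique classes)

## orientation of W is an assumption for this sketch:
W <- matrix(runif(N * C, min = -1, max = 1), nrow = C, ncol = N)
bias <- numeric(C)  # biases start at zero
```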
If an object inheriting class "hann1" is given to the argument
net, then its parameter values are used to initialize the
parameters of the network.
The main control parameters are given as a list to the control argument; they are detailed in the help page of control.hann().
References:

Hopfield, J. J. (1982) Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, USA, 79, 2554--2558. doi:10.1073/pnas.79.8.2554.

Krotov, D. and Hopfield, J. J. (2016) Dense associative memory for pattern recognition. doi:10.48550/ARXIV.1606.01164.
See Also:

buildSigma, hann, predict.hann1