hann (version 1.2)

hann.R: Top-Level Functions and Methods

Description

Functions to fit Hopfield artificial neural networks or to access the results.

Usage

hann(xi, sigma, classes, H = NULL, labels = NULL, net = NULL,
     control = control.hann())
# S3 method for hann
print(x, ...)
# S3 method for hann
summary(object, ...)
# S3 method for hann
str(object, ...)
# S3 method for hann
plot(x, y, type = "h", ...)
# S3 method for hann
coef(object, ...)
# S3 method for hann
fitted(object, ...)
# S3 method for hann
labels(object, ...)
# S3 method for hann
predict(object, ...)

Value

hann returns an object of class c("hann", "hann1") or c("hann", "hann3"); see the links below for their description.

Arguments

xi

a matrix of patterns with K rows and N columns.

sigma

a vector coding the Hopfield network (length N).

classes

the classes of the patterns (vector of length K).

H

the number of neurons in the hidden layer (can be 0).

labels

a vector of labels used for the classes.

net, x, object

an object inheriting from class "hann".

control

the control parameters.

y

(unused).

type

the type of plot for vectors of parameters (biases).

...

options passed to other methods.
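
The dimension constraints tying xi, sigma, and classes together can be illustrated with a small sketch (the -1/+1 coding follows the binarized patterns used in the Examples; all names below are local to the sketch, not from the package):

```r
## K patterns of N pixels each: xi is K x N, sigma has one value per
## column of xi, classes has one value per row of xi
K <- 4; N <- 81                       # e.g., 4 patterns of 9 x 9 pixels
xi <- matrix(sample(c(-1, 1), K * N, replace = TRUE), K, N)
sigma <- sample(c(-1, 1), N)          # length N, like ncol(xi)
classes <- c("V", "H", "U", "D")      # length K, like nrow(xi)
stopifnot(ncol(xi) == length(sigma), nrow(xi) == length(classes))
```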

Details

hann() calls either hann1() or hann3() depending on the value given to the argument H (the number of hidden neurons).

The other functions are (standard) methods for accessing the results.
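
The H-based dispatch can be sketched as below; this is an assumption about the mechanism, not the package's actual source. The returned strings merely mark which fitting routine would be called (hann1() for a single-layer net, hann3() when hidden neurons are requested):

```r
## hypothetical sketch of the dispatch in hann(): the real function
## also validates arguments and performs the fit
hann_dispatch <- function(xi, sigma, classes, H = NULL, ...) {
    if (is.null(H) || H == 0)
        "hann1"   # no hidden layer: single-layer network
    else
        "hann3"   # H hidden neurons: three-layer network
}
hann_dispatch(NULL, NULL, NULL)          # "hann1"
hann_dispatch(NULL, NULL, NULL, H = 10)  # "hann3"
```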

See Also

hann1, hann3

Examples

## function to create 'images'; default size is 9x9 pixels,
## with 4 possible shapes ("V"ertical, "H"orizontal, "U"p-diag.,
## or "D"own-diag.)
rpat <- function(type, nr = 9L, nc = 9L,
                 signal = c(200, 255), noise = c(0, 50))
{
    ij <- switch(type,
                 "V" = cbind(1:nr, ceiling(nc/2)),
                 "H" = cbind(ceiling(nr/2), 1:nc),
                 "U" = cbind(nr:1, 1:nc),
                 "D" = cbind(1:nr, 1:nc))
    x <- matrix(runif(nr * nc, noise[1], noise[2]), nr, nc)
    x[ij] <- runif(nrow(ij), signal[1], signal[2])
    round(x)
}

## the 4 types of patterns to simulate:
labs <- c("V", "H", "U", "D")
## repeat them 40 times, this will be used as class-vector:
cl <- rep(labs, each = 40)
## simulate the images:
xi <- t(sapply(cl, rpat))
## binarize the patterns (-1/+1):
xi <- binarize(xi)
## build the sigma vector:
sig <- buildSigma(xi, quiet = TRUE)

## optimize the neural net with 10 hidden neurons:
ctr <- control.hann(quiet = TRUE)
nt <- hann(xi, sig, cl, H = 10, control = ctr)

## convergence depends on the initial parameter values, so the
## previous command may need to be repeated a few times until the
## next one shows values only on the diagonal (which can be reached
## with the default 100 iterations)

table(cl, predict(nt, xi, rawsignal = FALSE))

## now generate 10 new patterns of each type...
new_cl <- rep(labs, each = 10)
new_xi <- binarize(t(sapply(new_cl, rpat)))
## ... and see how well they are predicted:
table(new_cl, predict(nt, new_xi, rawsignal = FALSE))
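
A convenient summary of such a confusion table is the overall accuracy, i.e., the proportion of counts on the diagonal. A minimal illustration with a made-up table (the fitted results above vary from run to run, so this matrix is not output from the model):

```r
## accuracy = diagonal counts / total counts (illustrative matrix)
tab <- matrix(c(10, 0, 0,
                 0, 9, 1,
                 0, 1, 9), 3, 3)
sum(diag(tab)) / sum(tab)  # 28/30
```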

## visualize the optimized neural net
layout(matrix(1:6, 2, 3, TRUE))
plot(nt)
layout(1)
