RSNNS (version 0.4-9)

som: Create and train a self-organizing map (SOM)

Description

This function creates and trains a self-organizing map (SOM). SOMs are neural networks with one hidden layer. The network structure is similar to LVQ, but the method is unsupervised and uses a notion of neighborhood between the units. The general idea is that the map develops, by itself, a notion of similarity among the inputs and represents this similarity as spatial nearness on the map. Every hidden unit represents a prototype. The goal of learning is to distribute the prototypes in the feature space such that the (probability density of the) input is represented well. SOMs are usually built with a 1d, 2d quadratic, 2d hexagonal, or 3d neighborhood, so that they can be visualized straightforwardly. The SOM implemented in SNNS has a 2d quadratic neighborhood. As the computation of this function might be slow if many patterns are involved, much of its output is made switchable (see comments on return values).
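The prototype-update idea above can be written compactly in a standard textbook formulation (this is the general Kohonen rule, not necessarily the exact SNNS parameterization):

```latex
% One SOM update step: move the winner c and its map neighbors toward x
\Delta w_i = \eta(t)\, h(c, i)\, (x - w_i),
\qquad c = \arg\min_j \lVert x - w_j \rVert
```

Here $w_i$ is the prototype of unit $i$, $\eta(t)$ is a (typically decaying) learning rate, and $h(c, i)$ is a neighborhood function that decreases with the map distance between the winner $c$ and unit $i$, so nearby units on the map learn similar prototypes.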

Usage

som(x, ...)
# S3 method for default
som(x, mapX = 16, mapY = 16, maxit = 100,
    initFuncParams = c(1, -1),
    learnFuncParams = c(0.5, mapX/2, 0.8, 0.8, mapX),
    updateFuncParams = c(0, 0, 1),
    shufflePatterns = TRUE, calculateMap = TRUE, calculateActMaps = FALSE,
    calculateSpanningTree = FALSE, saveWinnersPerPattern = FALSE,
    targets = NULL, ...)

Arguments

x
a matrix with training inputs for the network
...
additional function parameters (currently not used)
mapX
the x dimension of the som
mapY
the y dimension of the som
maxit
maximum number of iterations to learn
initFuncParams
the parameters for the initialization function
learnFuncParams
the parameters for the learning function
updateFuncParams
the parameters for the update function
shufflePatterns
should the patterns be shuffled?
calculateMap
should the som be calculated?
calculateActMaps
should the activation maps be calculated?
calculateSpanningTree
should the SNNS kernel algorithm for generating a spanning tree be applied?
saveWinnersPerPattern
should a list with the winners for every pattern be saved?
targets
optional target classes of the patterns

Value

an rsnns object. Depending on which calculation flags are switched on, the som generates some special members, e.g., map, componentMaps, labeledMap, and labeledUnits (see the Examples).

Details

Internally, this function uses the initialization function Kohonen_Weights_v3.2, the learning function Kohonen, and the update function Kohonen_Order of SNNS.
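To illustrate what the Kohonen learning function does conceptually, here is a minimal sketch of one training pass over a small 2d quadratic map in base R. This is not the SNNS kernel implementation: the hard neighborhood, the fixed learning rate, and all variable names are simplifying assumptions for illustration only (SNNS additionally decays the learning rate and radius, cf. the learnFuncParams defaults).

```r
# Sketch of one SOM training pass (illustrative only, not the SNNS kernel)
set.seed(1)
mapX <- 4; mapY <- 4                       # small 2d quadratic map
x <- matrix(runif(20 * 3), ncol = 3)       # 20 patterns, 3 features
# prototypes: one weight vector per map unit, random init in [-1, 1]
w <- matrix(runif(mapX * mapY * 3, -1, 1), ncol = 3)
grid <- expand.grid(gx = 1:mapX, gy = 1:mapY)  # unit positions on the map

eta <- 0.5                                 # learning rate (kept fixed here)
radius <- mapX / 2                         # neighborhood radius (kept fixed here)
for (p in sample(nrow(x))) {               # cf. shufflePatterns = TRUE
  d <- colSums((t(w) - x[p, ])^2)          # squared distance to each prototype
  win <- which.min(d)                      # winning unit
  # map distance from the winner on the quadratic grid
  gdist <- abs(grid$gx - grid$gx[win]) + abs(grid$gy - grid$gy[win])
  h <- as.numeric(gdist <= radius)         # simple hard neighborhood
  # move the winner and its neighbors toward the input
  w <- w + eta * h * (matrix(x[p, ], nrow(w), 3, byrow = TRUE) - w)
}
```

After enough passes with decaying eta and radius, the rows of w spread out to cover the input distribution, which is exactly what som does internally on a larger scale.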

References

Kohonen, T. (1988), Self-organization and associative memory, Vol. 8, Springer-Verlag.

Zell, A. et al. (1998), 'SNNS Stuttgart Neural Network Simulator User Manual, Version 4.2', IPVR, University of Stuttgart and WSI, University of Tübingen. http://www.ra.cs.uni-tuebingen.de/SNNS/

Zell, A. (1994), Simulation Neuronaler Netze, Addison-Wesley. (in German)

Examples

## Not run: demo(som_iris)
## Not run: demo(som_cubeSnnsR)


data(iris)
inputs <- normalizeData(iris[,1:4], "norm")

model <- som(inputs, mapX=16, mapY=16, maxit=500,  
                calculateActMaps=TRUE, targets=iris[,5])

par(mfrow=c(3,3))
for(i in 1:ncol(inputs)) plotActMap(model$componentMaps[[i]], 
                                       col=rev(topo.colors(12)))

plotActMap(model$map, col=rev(heat.colors(12)))
plotActMap(log(model$map+1), col=rev(heat.colors(12)))
persp(1:model$archParams$mapX, 1:model$archParams$mapY, log(model$map+1), 
     theta = 30, phi = 30, expand = 0.5, col = "lightblue")

plotActMap(model$labeledMap)

model$componentMaps
model$labeledUnits
model$map

names(model)
