ART1 works on binary input patterns only; for real-valued input, use
art2
instead.
Learning in an ART network works as follows:
A new input is to be classified according
to the prototypes already present in the net. The similarity between the input and
all prototypes is computed, and the most similar prototype is the winner.
If the similarity between the input and the winner is high enough (as defined by a
vigilance parameter), the winner is adapted to make it more similar to the input.
If the similarity is not high enough, a new prototype is created. So, at most the winner
is adapted; all other prototypes remain unchanged.
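The learning loop described above can be sketched in plain R. This is an illustrative toy, not the actual SNNS/RSNNS implementation; the similarity measure (fraction of matching bits) and the adaptation rule are simplified assumptions chosen only to show the winner/vigilance logic.

```r
# Toy sketch of ART-style learning on binary vectors (not the SNNS algorithm).
art_learn_step <- function(input, prototypes, vigilance = 0.7, rate = 0.5) {
  if (length(prototypes) == 0) {
    return(list(input))  # the first input becomes the first prototype
  }
  # similarity of the input to every prototype: fraction of matching bits
  sims <- sapply(prototypes, function(p) mean(p == input))
  winner <- which.max(sims)
  if (sims[winner] >= vigilance) {
    # resonance: adapt only the winner towards the input
    prototypes[[winner]] <- round((1 - rate) * prototypes[[winner]] + rate * input)
  } else {
    # no prototype is similar enough: create a new one
    prototypes <- c(prototypes, list(input))
  }
  prototypes
}

protos <- list()
protos <- art_learn_step(c(1, 1, 0, 0), protos)  # creates prototype 1
protos <- art_learn_step(c(1, 1, 0, 1), protos)  # similar enough: adapts prototype 1
protos <- art_learn_step(c(0, 0, 1, 1), protos)  # too dissimilar: new prototype
length(protos)  # 2 prototypes
```

Note how only the winning prototype is ever changed; a lower vigilance makes the net accept looser matches and thus form fewer clusters.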
art1(x, ...)

# S3 method for matrix
art1(x, dimX, dimY, f2Units = nrow(x), maxit = 100,
  initFunc = "ART1_Weights", initFuncParams = c(1, 1),
  learnFunc = "ART1", learnFuncParams = c(0.9, 0, 0),
  updateFunc = "ART1_Stable", updateFuncParams = c(0),
  shufflePatterns = TRUE, ...)
an rsnns
object. Its fitted.values
member contains a
list of two-dimensional activation patterns.
The default initialization function, ART1_Weights
, is the only one suitable for ART1 networks. It has
two parameters, which are explained in the SNNS User Manual (p. 189). A default of 1.0 for both is usually fine.
The only learning function suitable for ART1 is ART1
. Suitable update functions are ART1_Stable
and
ART1_Synchronous
. The difference between the two is that the former updates until the network reaches a
stable state, while the latter performs only one update step. Both the learning function and the update functions
have one parameter, the vigilance parameter.
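As a hedged illustration of the parameter passing just described (assuming RSNNS is installed and using the bundled `art1_letters` data from the examples below), the vigilance value goes into the first learning parameter and the single update parameter:

```r
library(RSNNS)
data(snnsData)
patterns <- snnsData$art1_letters.pat

# Pass a vigilance of 0.5 to both the learning and the update function;
# a lower vigilance accepts looser matches, so fewer clusters tend to form.
model <- art1(patterns, dimX = 7, dimY = 5,
              learnFuncParams = c(0.5, 0, 0),
              updateFuncParams = c(0.5))
```

The remaining two entries of learnFuncParams are left at their defaults here; see the SNNS User Manual for their meaning.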
In its current implementation, the network has two-dimensional input. The matrix x
contains all
(one-dimensional) input patterns. Internally, each of these patterns
is converted to a two-dimensional pattern using the parameters dimX
and dimY
.
The parameter f2Units
controls the number of units in the recognition layer, and thereby the maximum number of clusters
that are assumed to be present in the input patterns.
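A small sketch of these constraints (assuming RSNNS is installed; the f2Units value of 10 is an arbitrary illustrative choice): each row of x holds one flattened pattern, so dimX * dimY must match the number of columns.

```r
library(RSNNS)
data(snnsData)
patterns <- snnsData$art1_letters.pat

# Each pattern is reshaped internally into a dimX x dimY map,
# so dimX * dimY must equal ncol(patterns).
ncol(patterns)  # should equal 7 * 5 for this data set

# Cap the recognition layer at 10 units: at most 10 clusters can be formed.
model <- art1(patterns, dimX = 7, dimY = 5, f2Units = 10)
```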
A detailed description of the theory and the parameters is available from the SNNS documentation and the other referenced literature.
Grossberg, S. (1988), Adaptive pattern classification and universal recoding. I.: parallel development and coding of neural feature detectors, MIT Press, Cambridge, MA, USA, chapter I, pp. 243--258.
Herrmann, K.-U. (1992), 'ART -- Adaptive Resonance Theory -- Architekturen, Implementierung und Anwendung', Master's thesis, IPVR, University of Stuttgart. (in German)
Zell, A. et al. (1998), 'SNNS Stuttgart Neural Network Simulator User Manual, Version 4.2', IPVR, University of Stuttgart and WSI, University of Tübingen. http://www.ra.cs.uni-tuebingen.de/SNNS/
Zell, A. (1994), Simulation Neuronaler Netze, Addison-Wesley. (in German)
art2
, artmap
## Not run: demo(art1_letters)
## Not run: demo(art1_lettersSnnsR)

data(snnsData)
patterns <- snnsData$art1_letters.pat

# convert the patterns to two-dimensional activation maps and plot some of them
inputMaps <- matrixToActMapList(patterns, nrow = 7)
par(mfrow = c(3, 3))
for (i in 1:9) plotActMap(inputMaps[[i]])

# train the ART1 network and extract the cluster assignments
model <- art1(patterns, dimX = 7, dimY = 5)
encodeClassLabels(model$fitted.values)