neuralnet

neuralnet is used to train neural networks using resilient backpropagation (RPROP) with (Riedmiller, 1994) or without weight backtracking (Riedmiller and Braun, 1993), or the modified globally convergent version by Anastasiadis et al. (2005). The function allows flexible settings through a custom choice of error and activation function. Furthermore, the calculation of generalized weights (Intrator O. and Intrator N., 1993) is implemented.

Usage

neuralnet(formula, data, hidden = 1, threshold = c(0.001),
stepmax = 1e+05, rep = 1, weights.mean = 0, weights.variance = 1,
startweights = NULL, learningrate.limit = NULL,
learningrate.factor = list(minus = 0.5, plus = 1.2), lifesign = "none",
lifesign.step = 1000, algorithm = "rprop+", err.fct = "sse",
act.fct = "logistic", linear.output=TRUE, family = NULL)
Arguments

formula: a symbolic description of the model to be fitted.
data: a data frame in which the variables specified in formula will be found.

Value

neuralnet returns an object of class nn. An object of class nn is a list containing at most the following components, describing the fitted network together with the response, covariates and model terms extracted from the data and formula arguments.
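A short sketch of how such an object is typically inspected; the component names net.result, weights, generalized.weights and result.matrix are those of the CRAN release of neuralnet and are assumed here rather than confirmed by this page.

## assumes a fitted object, e.g. net.infert from the Examples below
str(net.infert, max.level = 1)   # overview of the list components
net.infert$weights               # fitted weights, one set per repetition
net.infert$generalized.weights   # generalized weights per covariate
net.infert$result.matrix         # error, reached threshold and weights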
See Also

plot.nn for plotting of the neural network.
gwplot for plotting of the generalized weights.
compute for computation of the calculated network (a short usage sketch for these helpers follows the Examples below).

Examples

## truth tables of the AND and OR gates over three binary inputs, plus the bit sum
AND <- c(rep(0,7),1)
OR <- c(0,rep(1,7))
SUM <- c(0,1,1,2,1,2,2,3)
binary.data <- data.frame(expand.grid(c(0,1), c(0,1), c(0,1)), AND, OR, SUM)

## classification: two binary responses, cross-entropy error, no hidden layer
print(net <- neuralnet(AND+OR~Var1+Var2+Var3, binary.data, hidden=0, rep=10,
                       err.fct="ce", linear.output=FALSE))

## regression: the bit sum modelled with a linear output
print(net.sum <- neuralnet(SUM~Var1+Var2+Var3, binary.data, hidden=0,
                           linear.output=TRUE))
## inspect the predictions of the trained network
net.sum$predictions

## case-control data: infertility after spontaneous and induced abortion
data(infert, package="datasets")
print(net.infert <- neuralnet(case~parity+induced+spontaneous, infert,
                              err.fct="ce", linear.output=FALSE, family=binomial()))