neuralnet is used to train neural networks using backpropagation, resilient backpropagation (RPROP) with (Riedmiller, 1994) or without weight backtracking (Riedmiller and Braun, 1993), or the modified globally convergent version (GRPROP) by Anastasiadis et al. (2005). The function allows flexible settings through custom choice of error and activation function. Furthermore, the calculation of generalized weights (Intrator O. and Intrator N., 1993) is implemented.

Usage

neuralnet(formula, data, hidden = 1, threshold = 0.01,
          stepmax = 1e+05, rep = 1, startweights = NULL,
          learningrate.limit = NULL,
          learningrate.factor = list(minus = 0.5, plus = 1.2),
          learningrate = NULL, lifesign = "none",
          lifesign.step = 1000, algorithm = "rprop+",
          err.fct = "sse", act.fct = "logistic",
          linear.output = TRUE, exclude = NULL,
          constant.weights = NULL, likelihood = FALSE)

Value

neuralnet returns an object of class nn.
An object of class nn is a list containing at most the following components:

call                 the matched call.
response             extracted from the data argument.
covariate            the variables extracted from the data argument.
model.list           a list containing the covariates and the response variables extracted from the formula argument.
err.fct              the error function.
act.fct              the activation function.
data                 the data argument.
net.result           a list containing the overall result of the neural network for every repetition.
weights              a list containing the fitted weights of the neural network for every repetition.
generalized.weights  a list containing the generalized weights of the neural network for every repetition.
result.matrix        a matrix containing the reached threshold, needed steps, error, AIC and BIC (if computed) and weights for every repetition; each column represents one repetition.
startweights         a list containing the starting weights of the neural network for every repetition.

See Also

plot.nn for plotting the neural network.
gwplot for plotting the generalized weights.
compute for computation of a given neural network for given covariate vectors.
confidence.interval for calculation of confidence intervals of the weights.
prediction for a summary of the output of the neural network.

Examples

AND <- c(rep(0,7),1)
OR <- c(0,rep(1,7))
binary.data <- data.frame(expand.grid(c(0,1), c(0,1), c(0,1)), AND, OR)
print(net <- neuralnet( AND+OR~Var1+Var2+Var3, binary.data, hidden=0, rep=10, err.fct="ce", linear.output=FALSE))
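The trained network can be applied to new covariate vectors with compute. A minimal sketch (the data frame new.data and the variable out are illustrative names; the columns must appear in the same order as the covariates in the formula):

```r
# Apply the trained network to new input patterns.
# compute() returns a list with components $neurons and $net.result.
new.data <- data.frame(Var1 = c(1, 0), Var2 = c(1, 1), Var3 = c(1, 0))
out <- compute(net, new.data)
print(out$net.result)  # one row of predicted AND and OR outputs per pattern
```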
data(infert, package="datasets")
print(net.infert <- neuralnet(case~parity+induced+spontaneous, infert,
                              err.fct="ce", linear.output=FALSE,
                              likelihood=TRUE))
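Because the infert model is fitted with likelihood=TRUE, confidence intervals for the weights can be computed; prediction and gwplot, mentioned above, summarize and visualize the fit. A sketch continuing from net.infert (the variables ci and pred are illustrative names):

```r
# Confidence intervals for the weights (requires likelihood = TRUE)
ci <- confidence.interval(net.infert, alpha = 0.05)
print(ci$lower.ci)
print(ci$upper.ci)

# Summary of the network output for each unique covariate combination
pred <- prediction(net.infert)

# Plot the generalized weights for a selected covariate
gwplot(net.infert, selected.covariate = "parity")
```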