Search Results:

Showing results 1 to 10 of 23.


Function neuralnet [neuralnet v1.44.2]
keywords: neural
title: Training of neural networks
description: Train neural networks using backpropagation, resilient backpropagation (RPROP) with weight backtracking (Riedmiller, 1994) or without it (Riedmiller and Braun, 1993), or the modified globally convergent version (GRPROP) by Anastasiadis et al. (2005). The function allows flexible settings through a custom choice of error and activation functions. Furthermore, the calculation of generalized weights (Intrator and Intrator, 1993) is implemented.
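
A minimal training sketch, assuming a small toy regression task (the sqrt data, the three-neuron hidden layer and the SSE/logistic settings are arbitrary illustration choices, not package recommendations):

    library(neuralnet)
    # toy regression problem: learn the square root on [0, 10]
    set.seed(42)
    train_df <- data.frame(x = runif(50, 0, 10))
    train_df$y <- sqrt(train_df$x)
    nn <- neuralnet(y ~ x, data = train_df, hidden = 3,
                    err.fct = "sse", act.fct = "logistic",
                    linear.output = TRUE)
    nn$result.matrix   # final error, number of steps and fitted weights
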
Function plot.nn [neuralnet v1.44.2]
keywords: neural
title: Plot method for neural networks
description: plot.nn, a method for the plot generic. It is designed for inspecting the weights of objects of class nn, typically produced by neuralnet.
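
Assuming a fitted object such as the nn from the sketch above, the plot method is called directly; the optional rep = "best" selects the repetition with the smallest error:

    # draw the network topology with synaptic weights and bias terms
    plot(nn, rep = "best")
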
Function gwplot [neuralnet v1.44.2]
keywords: neural
title: Plot method for generalized weights
description: gwplot, a method for objects of class nn, typically produced by neuralnet. Plots the generalized weights (Intrator and Intrator, 1993) for one specific covariate and one response variable.
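
Continuing with the same fitted nn, a sketch of a generalized-weights plot for one covariate/response pair (the covariate is selected by name here; an integer column index should work as well):

    # generalized weights of covariate "x" with respect to the first response
    gwplot(nn, selected.covariate = "x", selected.response = 1)
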
Function neuralnet-package [neuralnet v1.44.2]
keywords: neural
title: Training of Neural Networks
description: Training of neural networks using backpropagation, resilient backpropagation with weight backtracking (Riedmiller, 1994) or without it (Riedmiller and Braun, 1993), or the modified globally convergent version by Anastasiadis et al. (2005). The package allows flexible settings through a custom choice of error and activation functions. Furthermore, the calculation of generalized weights (Intrator and Intrator, 1993) is implemented.
Function prediction [neuralnet v1.44.2]
keywords: neural
title: Summarizes the output of the neural network, the data and the fitted values of glm objects (if available)
description: prediction, a method for objects of class nn, typically produced by neuralnet. In a first step, the data frame is extended with a mean response, i.e. the mean of all responses corresponding to the same covariate vector. The calculated data.error is the error function between the original response and the new mean response. In a second step, all duplicate rows are removed to give a quick overview of the data. To obtain an overview of the results of the neural network and the glm objects, the covariate matrix is bound to the output of the neural network and to the fitted values of the glm object (if available), again with all duplicate rows removed.
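
A short usage sketch with the nn object fitted above:

    # deduplicated overview of the data, mean responses and network output
    prediction(nn)
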
Function confidence.interval [neuralnet v1.44.2]
keywords: neural
title: Calculates confidence intervals of the weights
description: confidence.interval, a method for objects of class nn, typically produced by neuralnet. Calculates confidence intervals of the weights (White, 1989) and the network information criterion NIC (Murata et al., 1994). All confidence intervals are calculated under the assumption of local identification of the given neural network. If this assumption is violated, the results will not be reasonable. Please also make sure that the chosen error function equals the negative log-likelihood function; otherwise the results are not meaningful either.
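
A sketch on a binary toy problem, fitted with err.fct = "ce" and linear.output = FALSE so that the error function equals the negative log-likelihood; the simulated data, hidden layer size and alpha are arbitrary choices:

    library(neuralnet)
    set.seed(1)
    d <- data.frame(x1 = rnorm(200), x2 = rnorm(200))
    d$y <- as.integer(d$x1 + d$x2 + rnorm(200, sd = 0.3) > 0)
    nn_ce <- neuralnet(y ~ x1 + x2, data = d, hidden = 2,
                       err.fct = "ce", act.fct = "logistic",
                       linear.output = FALSE, stepmax = 1e6)  # generous step limit
    ci <- confidence.interval(nn_ce, alpha = 0.05)
    str(ci)   # interval bounds for each weight plus the NIC
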
Function avNNet [caret v6.0-84]
keywords: neural
title: Neural Networks Using Model Averaging
description: Aggregate several neural network models
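
A minimal sketch, assuming iris purely as a convenient built-in dataset; repeats, size and decay are arbitrary values passed through to nnet:

    library(caret)
    # average five nnet fits started from different random weights
    set.seed(7)
    fit_av <- avNNet(Species ~ ., data = iris, repeats = 5,
                     size = 5, decay = 0.01, trace = FALSE)
    head(predict(fit_av, newdata = iris, type = "class"))
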
Function pcaNNet [caret v6.0-84]
keywords: neural
title: Neural Networks with a Principal Component Step
description: Run PCA on a dataset, then use it in a neural network model
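
A companion sketch under the same assumptions; thresh = 0.95 keeps enough principal components to explain 95% of the predictor variance:

    library(caret)
    # PCA on the predictors, then an nnet model on the component scores
    set.seed(7)
    fit_pca <- pcaNNet(Species ~ ., data = iris, thresh = 0.95,
                       size = 5, trace = FALSE)
    head(predict(fit_pca, newdata = iris, type = "class"))
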
Function compute.ann [quarrint v1.0.0]
keywords: neural
title: Neural Network-based Interaction Index for a Quarry
description: Given an object of type quarry, a neural network computes the interaction index (low, medium, high or very high).
Function train.ann [quarrint v1.0.0]
keywords: neural
title: Training an Artificial Neural Network for Interaction Prediction
description: The function trains a neural network to be used with the functions compute.interaction and compute.ann. The neural network can then be used to predict whether the level of interaction between a quarry and the groundwater is low, medium, high or very high. The user can specify: the explanatory variables to be used; the data frame used to train and validate the network; the structure of the hidden layers; the number of repetitions for the neural network training.
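
A hypothetical end-to-end sketch for the two quarrint functions above. The argument names (data, hidden, rep) and the argument order of compute.ann are assumptions inferred from the descriptions, not verified against the package manual; consult ?train.ann and ?compute.ann for the documented signatures. my_training_frame and my_quarry are placeholder objects.

    library(quarrint)
    # assumed argument names -- see the note above
    ann <- train.ann(data = my_training_frame,  # training/validation data (assumed name)
                     hidden = c(5),             # hidden-layer structure (assumed name)
                     rep = 3)                   # number of training repetitions (assumed name)
    # score a quarry object with the trained network (argument order assumed)
    interaction_index <- compute.ann(my_quarry, ann)
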