Calculates the linear neuron output; no transfer function is applied.
createDataSet,ANY,missing,formula,missing-method
Constructor function for DataSet objects.
Sets the number of visible units.
Binary sigmoid unit function.
Class for specifying datasets.
getDropoutOneMaskPerEpoch,DArch-method
Return the dropout usage
Fine tuning function for the deep architecture
Returns the cancel value
Adds a field to a layer
Returns the update value for the biases of the hidden units.
Returns a list of RBMs of the DArch object.
Returns the number of epochs the Net was trained for.
Maxout unit function with unit derivatives.
Returns the learning rate for the bias weights of the DArch object.
Applies the given dropout mask to the given data row-wise.
Linear unit function.
Returns the dropout rate for the hidden layers
Constructor function for RBM objects.
Returns whether the weights are saved as ff objects.
Class for restricted Boltzmann machines
Fit a deep neural network using a formula and a single data frame or matrix.
getLayerWeights,DArch-method
Returns the weights of a layer from the DArch object.
preTrainDArch,DArch-method
Pre-trains a DArch network.
Returns the momentum of the Net.
Dropout mask generator function.
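The dropout entries in this index (a mask generator and row-wise mask application) can be illustrated with a short sketch. This is a Python illustration, not the package's R code, and both function names are hypothetical:

```python
import numpy as np

def generate_dropout_mask(n_units, dropout_rate, rng=None):
    # Keep each unit with probability 1 - dropout_rate (1 = keep, 0 = drop).
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random(n_units) >= dropout_rate).astype(float)

def apply_dropout_mask(data, mask):
    # Row-wise application: broadcasting multiplies every row by the mask.
    return data * mask
```

Generating one mask per epoch (rather than per batch) simply means reusing the same mask across all batches of that epoch.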
Returns the learning rate of the weights.
Pre-trains a DArch network.
Returns the update value for the weights.
incrementEpochs,Net-method
Increment the number of epochs this Net has been trained for.
Returns the biases of the hidden units.
getDropoutMasks,DArch-method
Returns the dropout masks
Constructor function for DArch objects.
Create a data set using data, targets, a formula, and possibly an existing data set.
getExecOutput,DArch-method
Returns the execution output of the layer from the DArch object.
Abstract class for neural networks.
Saves an RBM network.
Returns the learning rate for the hidden biases.
Quadratic error function
Fit a deep neural network.
getExecuteFunction,DArch-method
Returns the function for the execution of the DArch network.
createDataSet,ANY,ANY,missing,DataSet-method
Create a new DataSet by filling an existing one with new data.
getLayerFunction,DArch-method
Returns the neuron function of a layer from the DArch object.
addLayerField,DArch-method
Adds a field to a layer.
Returns the batch size of the Net.
Sets the update values for the weights.
Returns the dropout rate for the input layer
Increment the number of epochs this Net has been trained for.
Sets the biases of the hidden units for the RBM object.
Fit a deep neural network.
Returns the field of a layer from the DArch object.
Sets the cancel message.
Returns a list with the states of the visible units.
Cross-entropy error function.
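For binary targets t and network outputs o, the cross-entropy error averages -t*log(o) - (1-t)*log(1-o) over the data. A hedged Python sketch (the function name and the eps clipping are illustrative, not the package's API):

```python
import numpy as np

def cross_entropy_error(targets, outputs, eps=1e-12):
    # Clip outputs away from 0 and 1 so the logarithms stay finite.
    outputs = np.clip(outputs, eps, 1.0 - eps)
    return -np.mean(targets * np.log(outputs)
                    + (1.0 - targets) * np.log(1.0 - outputs))
```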
Linear unit function with unit derivatives.
Returns a list with the states of the hidden units.
Loads weights and biases for an RBM network from an ffData file.
Set the dropout masks.
Sets the batch size of the Net.
addExecOutput,DArch-method
Adds an execution output for a DArch object.
Returns the cancel message
getCancelMessage,DArch-method
Returns the cancel message
Returns the execution output of the layer from the DArch object.
Returns the dropout masks.
Returns the current momentum of the Net.
Returns the update value for the biases of the visible units.
Resets the output list of the DArch object.
setInitialMomentum<-,Net-method
Sets the initial momentum of the Net.
getDropoutMask,DArch-method
Returns the dropout mask for the given layer
getInitialMomentum,Net-method
Returns the momentum of the Net
Returns the function for generating weight matrices.
Sets the final momentum of the Net.
Returns a layer from the DArch object.
Set the dropout mask for the given layer.
Returns the biases of the visible units.
Returns a list of layers from the DArch object.
Returns the number of epochs the Net was trained for.
Class for deep architectures.
Returns the neuron function of a layer from the DArch object.
getExecOutputs,DArch-method
Returns the execution output list of the DArch object.
Minimize a differentiable multivariate function.
Adds a layer to the DArch object.
Conjugate gradient for a classification network.
Returns a layer from the DArch object.
Loads an RBM network.
Trains an RBM with contrastive divergence.
Returns the execution output list of the DArch object.
Returns the weight cost for the training.
Returns the output of the RBM.
Returns the final momentum of the Net.
Returns the function for the execution of the DArch network.
Sets the layers for the network.
Returns the weights of a layer from the DArch object.
Returns a list of RBMs of the DArch object.
Generates the RBMs for the pre-training.
Returns the fine tune function for the DArch object.
Adds an execution output for a DArch object.
Resilient backpropagation training for deep architectures.
Returns the weights of the RBM.
Create and train a DArch object using a DataSet.
Set whether the learning shall be canceled.
Returns the cancel value
Sets the positive phase data for the training
Adds a layer to the DArch object.
createDataSet,ANY,ANY,missing,missing-method
Create a DataSet using data and targets.
Sets the unit function of the visible units.
getLayerField,DArch-method
Returns the field of a layer from the DArch object.
getDropoutHiddenLayers,DArch-method
Returns the dropout rate for the hidden layers.
getDropoutOneMaskPerEpoch
Return the dropout usage.
Removes a layer from the DArch object.
Sets the function for generating weight matrices.
Returns the number of visible units of the RBM
Returns the list of statistics for the network
getFineTuneFunction,DArch-method
Returns the fine tune function for the DArch object.
Provides the MNIST data set in the given folder.
setLearnRateBiasVisible<-
Sets the learning rates of the biases for the visible units.
Continuous Tan-Sigmoid unit function.
Sets the unit function of the hidden units
Sets the states of the visible units
Execute the darch
Resets the weights and biases of the RBM object.
Sets the weights of the RBM object.
Sets the dropout rate for the input layer.
Calculates the neuron output with the sigmoid function
validateDataSet,DataSet-method
Function for updating the weights and biases of an RBM
Conjugate gradient for an autoencoder network.
Returns the error function of the Net.
Sets whether the weights are saved as ff objects.
Mean squared error function
Sets the learning rates of the biases for the hidden units
Sets the biases of the visible units for the RBM object.
Sets the update value for the biases of the hidden units.
Softmax unit function with unit derivatives.
Sets the dropout rate for the hidden layers.
Returns whether weight normalization is active
Sigmoid unit function with unit derivatives.
Saves weights and biases of an RBM network into an ffData file.
Returns a list of layers from the DArch object.
setDropoutOneMaskPerEpoch<-
Set dropout mask usage.
Sets the initial momentum of the Net
Sets the momentum switch of the Net.
Returns the data for the positive phase.
Set whether weight normalization should be performed
Sets a layer with the given index for the network
Sets the states of the hidden units
Sets the error function of the Net.
Sets the fine tuning function for the network.
setNormalizeWeights<-,Net-method
Set whether weight normalization should be performed
Sets the update function of the RBM object.
Sets the weights of a layer with the given index.
getDropoutInputLayer,DArch-method
Returns the dropout rate for the input layer
Forward-propagate data.
setDropoutOneMaskPerEpoch<-,DArch-method
Set dropout mask usage
Sets the learning rate for the biases
Sets the update value for the biases of the visible units
Loads a DArch network
Sets the log level for the Net.
getNormalizeWeights,Net-method
Returns whether weight normalization is active.
Saves a DArch network
Continuous Tan-Sigmoid unit function.
Sets the number of hidden units
Function for generating ff files of the MNIST Database
Sets the learning rate for the weights.
Sets the weight costs for the training
Generates a weight matrix.
Sets the list of RBMs
Sigmoid unit function.
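The sigmoid unit function, and its "with unit derivatives" variant listed elsewhere in this index, can be sketched as follows (Python illustration only; the package itself is written in R):

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: maps any activation into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_with_derivative(x):
    # A "unit function with unit derivatives" returns both the activation
    # and its derivative s * (1 - s), as backpropagation needs.
    s = sigmoid(x)
    return s, s * (1.0 - s)
```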
Backpropagation learning function
Adds a list of statistics to the network
Sets the function for a layer with the given index
Softmax unit function.
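A minimal sketch of the softmax unit function (Python illustration, not the package's R implementation); subtracting the row maximum before exponentiating is a standard trick for numerical stability:

```python
import numpy as np

def softmax(x):
    # Row-wise softmax: exponentiate, then normalize each row to sum to 1.
    z = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)
```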
getLearnRateBiases,DArch-method
Returns the learning rate for the bias weights of the DArch object.
Returns the learning rate for the visible biases.
Makes start- and end-points for the batches.
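Making start and end points for the batches amounts to slicing the row indices; a sketch under the assumption that the last batch may be smaller than the rest (Python, hypothetical name):

```python
def make_batch_bounds(n_rows, batch_size):
    # One (start, end) pair per batch; the final batch may be short.
    return [(start, min(start + batch_size, n_rows))
            for start in range(0, n_rows, batch_size)]
```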
Sets the execution function for the network
Sets a field in a layer.
Returns the momentum switch of the Net.
Returns the dropout mask for the given layer.
Returns the number of hidden units of the RBM.
Resets the weights and biases of the DArch object.
Trains an RBM with contrastive divergence.
fineTuneDArch,DArch-method
Fine tuning function for the deep architecture.
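Several entries above concern training an RBM with contrastive divergence. A minimal CD-1 sketch in Python (not the package's R implementation; bias updates, momentum, and weight cost are omitted, and all names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(weights, visible, learn_rate=0.1, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # Positive phase: hidden probabilities driven by the data.
    pos_hidden = sigmoid(visible @ weights)
    hidden_states = (rng.random(pos_hidden.shape) < pos_hidden).astype(float)
    # Negative phase: one Gibbs step -- reconstruct the visible units,
    # then recompute the hidden probabilities from the reconstruction.
    recon = sigmoid(hidden_states @ weights.T)
    neg_hidden = sigmoid(recon @ weights)
    # Update from the difference of data-driven and model-driven correlations.
    grad = (visible.T @ pos_hidden - recon.T @ neg_hidden) / visible.shape[0]
    return weights + learn_rate * grad
```

Pre-training a deep architecture stacks such RBM updates layer by layer before fine tuning.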
Sets the output of the RBM object.
Calculates the neuron output with the sigmoid function.
Sets the unit function of the hidden units
Sets a layer with the given index for the network
Sets the initial momentum of the '>Net
Sets the layers for the network
Sets the update value for the biases of the visible units
Set the dropout mask for the given layer.
setNormalizeWeights<-,Net-method
Set whether weight normalization should be performed
Sets the dropout rate for the input layer.
Sets the function for a layer with the given index
Sets the execution function for the network
Sets the learning rate for the biases
Sets a field in a layer.
Sets whether the weights are saved as ff objects.
Sets the unit function of the visible units
Sets the cancel message.
setDropoutOneMaskPerEpoch<-,DArch-method
Set dropout mask usage
Sets the number of hidden units
Sets the output of the RBM object.
Sets the number of visible units.
Set whether weight normalization should be performed
Sets the weights of a layer with the given index
Sets the update value for the biases of the hidden units
Set the dropout masks.
Sets the dropout rate for the hidden layers.
Sets the update function of the RBM object.
Sets the learning rate for the weights.
Sets the biases of the visible units for the RBM object.
Sets the final momentum of the Net.
Sets the states of the hidden units.
Sets the fine tuning function for the network
Sets the positive phase data for the training
Sets the batch size of the Net.
Sets the learning rates of the biases for the hidden units.
setInitialMomentum<-,Net-method
Sets the initial momentum of the Net.
Sets the weights of the RBM object.
setLearnRateBiasVisible<-
Sets the learning rates of the biases for the visible units.
Sets the update values for the weights
Sets the log level for the Net.
Sets the states of the visible units.
setDropoutOneMaskPerEpoch<-
Set dropout mask usage
Sets the function for generating weight matrices.
Sets the momentum switch of the Net.
Set whether the learning shall be canceled.
Sets the error function of the Net.
Sets the weight costs for the training.
Sets the list of RBMs
Sets the biases of the hidden units for the RBM object.
Adds a list of statistics to the network.