This class implements deep architectures and provides the ability to train them with pre-training using contrastive divergence and fine-tuning with backpropagation, resilient backpropagation, or conjugate gradients.
rbmList: A list which contains all RBMs for the pre-training.
layers: A list with the layer information. The first field holds the weights and the second field the unit function.
learnRateBiases: The learning rate for the bias weights.
fineTuneFunction: Contains the function for the fine-tuning.
executeFunction: Contains the function for executing the network.
executeOutput: A list which contains the outputs of every layer after an execution of the network.
cancel: Boolean value which indicates whether the network training is canceled.
cancelMessage: The message shown when the training is canceled.
dropoutInput: Dropout rate on the input layer.
dropoutHidden: Dropout rate on the hidden layers.
dropoutOneMaskPerEpoch: Logical indicating whether to generate a new dropout mask for each epoch (as opposed to for each batch).
dropoutMasks: List of dropout masks, used internally.
dataSet: The data set used for training.
preTrainParameters: A set of parameters keeping track of the state of the DBN in terms of pre-training.
fineTuningParameters: A set of parameters keeping track of the state of the DBN in terms of fine-tuning.
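Since DArch is an S4 class, the slots listed above can be read directly with R's `@` operator. The following is a sketch only (in practice the package's getter methods, such as getRBMList, are preferred where they exist; the variable `darchObj` is a hypothetical existing instance):

```r
# Assuming darchObj is an existing DArch instance (e.g. from newDArch):
nRBMs   <- length(darchObj@rbmList)    # one RBM per pair of adjacent layers
weights <- darchObj@layers[[1]][[1]]   # first field: weights of layer 1
unitFun <- darchObj@layers[[1]][[2]]   # second field: unit function of layer 1
lastOut <- darchObj@executeOutput      # per-layer outputs of the last execution
```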
The class inherits all attributes from the class Net. When a new instance is created with the constructor newDArch (recommended), the darch object contains (number of layers - 1) restricted Boltzmann machines (RBMs), which are used for the unsupervised pre-training of the network. The RBMs are saved in the attribute rbmList and can be fetched via the getter method getRBMList. The two attributes fineTuneFunction and executeFunction contain the functions for the fine-tuning (default: backpropagation) and for the execution (default: runDArch). The training of the network is performed by the two learning functions preTrainDArch and fineTuneDArch. The first function trains the network with the unsupervised method contrastive divergence. The second function uses the function in the attribute fineTuneFunction for the fine-tuning. After an execution of the network, the outputs of every layer can be found in the attribute executeOutput.
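A minimal end-to-end sketch of the workflow described above, using only the functions named in this class description. The exact argument names and signatures (layer sizes, batch size, epoch counts) are assumptions and may differ between darch package versions:

```r
library(darch)

# Toy data: the XOR problem
inputs  <- matrix(c(0,0, 0,1, 1,0, 1,1), ncol = 2, byrow = TRUE)
targets <- matrix(c(0, 1, 1, 0), ncol = 1)

# Constructor (recommended): 2 input, 3 hidden, 1 output unit,
# giving length(layers) - 1 = 2 RBMs for the pre-training
darch <- newDArch(c(2, 3, 1), batchSize = 2)

# Unsupervised pre-training with contrastive divergence
darch <- preTrainDArch(darch, inputs, maxEpoch = 50)

# Supervised fine-tuning; uses the function stored in the
# fineTuneFunction slot (default: backpropagation)
darch <- fineTuneDArch(darch, inputs, targets, maxEpoch = 200)

# Execute the network (default execute function: runDArch);
# the per-layer outputs land in the executeOutput slot
darch   <- runDArch(darch, inputs)
outputs <- darch@executeOutput
```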