darch (version 0.12.0)

backpropagation: Backpropagation learning function

Description

This function provides the backpropagation algorithm for deep architectures.

Usage

backpropagation(darch, trainData, targetData,
  bp.learnRate = getParameter(".bp.learnRate",
    rep(1, times = length(darch@layers))),
  bp.learnRateScale = getParameter(".bp.learnRateScale"),
  nesterovMomentum = getParameter(".darch.nesterovMomentum"),
  dropout = getParameter(".darch.dropout",
    rep(0, times = length(darch@layers) + 1), darch),
  dropConnect = getParameter(".darch.dropout.dropConnect"),
  matMult = getParameter(".matMult"),
  debugMode = getParameter(".debug", F), ...)

Arguments

darch

An instance of the DArch class.

trainData

The training data (inputs).

targetData

The target data (outputs).

bp.learnRate

Learning rates for backpropagation. Either a single value, or a vector with one entry per weight matrix when using different learning rates for each layer.

bp.learnRateScale

The learning rate is multiplied by this value after each epoch.

nesterovMomentum

See the darch.nesterovMomentum parameter of darch.

dropout

See the darch.dropout parameter of darch.

dropConnect

See the darch.dropout.dropConnect parameter of darch.

matMult

Matrix multiplication function, internal parameter.

debugMode

Whether debug mode is enabled, internal parameter.

...

Further parameters.

Value

The trained deep architecture.

Details

The only backpropagation-specific, user-relevant parameters are bp.learnRate and bp.learnRateScale; they can be passed to the darch function when backpropagation is selected as the fine-tuning function. bp.learnRate defines the backpropagation learning rate and can be specified either as a single scalar or as a vector with one entry per weight matrix, allowing per-layer learning rates. bp.learnRateScale is a single scalar containing a scaling factor that is applied to the learning rate(s) after each epoch.
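
For example, a minimal sketch of passing these parameters through darch (the layers specification and all values are illustrative, not recommendations):

data(iris)
# layers c(4, 20, 3) gives two weight matrices (4-20 and 20-3), so
# bp.learnRate takes two entries; each rate is multiplied by 0.99
# after every epoch via bp.learnRateScale
model <- darch(Species ~ ., iris, layers = c(4, 20, 3),
  darch.fineTuneFunction = "backpropagation",
  bp.learnRate = c(0.8, 0.5), bp.learnRateScale = 0.99)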

Backpropagation supports dropout and uses the weight update function as defined via the darch.weightUpdateFunction parameter of darch.
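
A sketch of combining dropout with backpropagation (the dropout rates are illustrative; see darch for the semantics of darch.dropout):

# illustrative per-layer dropout rates: 10% on the input layer, 50% on
# the hidden layer; setting dropConnect = TRUE would drop individual
# weights instead of units
model <- darch(Species ~ ., iris,
  darch.fineTuneFunction = "backpropagation",
  darch.dropout = c(0.1, 0.5),
  darch.dropout.dropConnect = FALSE)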

References

Rumelhart, D. E., Hinton, G. E., and Williams, R. J., Learning representations by back-propagating errors, Nature 323, pp. 533-536, DOI: 10.1038/323533a0, 1986.

See Also

darch

Other fine-tuning functions: minimizeAutoencoder, minimizeClassifier, rpropagation

Examples

# NOT RUN {
data(iris)
# train a network on iris, fine-tuned with backpropagation
model <- darch(Species ~ ., iris, darch.fineTuneFunction = "backpropagation")
# }
