darch (version 0.12.0)

minimizeClassifier: Conjugate gradient for a classification network

Description

This function trains a DArch classifier network with the conjugate gradient method.

Usage

minimizeClassifier(darch, trainData, targetData,
  cg.length = getParameter(".cg.length"),
  cg.switchLayers = getParameter(".cg.switchLayers"),
  dropout = getParameter(".darch.dropout"),
  dropConnect = getParameter(".darch.dropout.dropConnect"),
  matMult = getParameter(".matMult"), debugMode = getParameter(".debug"),
  ...)

Arguments

darch

An instance of the class DArch.

trainData

The training data matrix.

targetData

The labels for the training data.

cg.length

Number of line searches to perform.

cg.switchLayers

Epoch after which to train the full network instead of only the upper two layers.

dropout

See the darch.dropout parameter of darch.

dropConnect

See the darch.dropout.dropConnect parameter of darch.

matMult

Matrix multiplication function, internal parameter.

debugMode

Whether debug mode is enabled, internal parameter.

...

Further parameters.

Value

The trained DArch object.

Details

This function builds on code by G. Hinton et al. (http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html - last visited 2016-04-30) for the fine-tuning of deep belief nets. The original code is located in the files 'backpropclassify.m', 'CG_MNIST.m' and 'CG_CLASSIFY_INIT.m'. It implements fine-tuning for a classification network with backpropagation, using a direct translation to R of the minimize function by C. Rasmussen (available at http://www.gatsby.ucl.ac.uk/~edward/code/minimize/ - last visited 2016-04-30). The parameter cg.switchLayers switches between two training modes: as in the original code, only the top two layers are trained until the current epoch reaches cg.switchLayers; afterwards the entire network is trained.

minimizeClassifier supports dropout, but it does not use the weight update function defined via the darch.weightUpdateFunction parameter of darch; hence weight decay, momentum, etc. are not supported.
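As an illustration of this caveat, a minimal sketch of enabling dropout during conjugate-gradient fine-tuning (the darch.dropout rate of 0.2 is a hypothetical choice, not a recommended value):

```r
data(iris)
# Dropout is honored by minimizeClassifier, but any weight decay or
# momentum settings would be ignored, since this fine-tuning function
# bypasses darch.weightUpdateFunction.
model <- darch(Species ~ ., iris,
  darch.unitFunction = c("sigmoidUnit", "softmaxUnit"),
  darch.fineTuneFunction = "minimizeClassifier",
  darch.dropout = 0.2,  # hypothetical dropout rate
  cg.length = 3, cg.switchLayers = 5)
```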

See Also

darch, fineTuneDArch

Other fine-tuning functions: backpropagation, minimizeAutoencoder, rpropagation

Examples

data(iris)
model <- darch(Species ~ ., iris,
 preProc.params = list(method = c("center", "scale")),
 darch.unitFunction = c("sigmoidUnit", "softmaxUnit"),
 darch.fineTuneFunction = "minimizeClassifier",
 cg.length = 3, cg.switchLayers = 5)