The fine-tuning function for deep architectures. This function uses the
function saved in the attribute fineTuneFunction to train the deep
architecture.
fineTuneDArch(darch, dataSet, dataSetValid = NULL, numEpochs = 1,
bootstrap = T, isBin = FALSE, isClass = TRUE, stopErr = -Inf,
stopClassErr = 101, stopValidErr = -Inf, stopValidClassErr = 101, ...)
darch              The deep architecture to be fine-tuned.
dataSet            The dataset used for training.
dataSetValid       Optional dataset used for validation.
numEpochs          The number of training iterations.
bootstrap          Whether to use bootstrapping to create validation data.
isBin              Indicates whether the output data must be interpreted as
                   boolean values. Default is FALSE. If TRUE, every value
                   over 0.5 is interpreted as 1, every value under 0.5 as 0.
isClass            Indicates whether the training is for a classification
                   network. If TRUE, statistics for classification will be
                   determined. Default is TRUE.
stopErr            Stop criterion for the error on the training data.
                   Default is -Inf.
stopClassErr       Stop criterion for the classification error on the
                   training data. Default is 101.
stopValidErr       Stop criterion for the error on the validation data.
                   Default is -Inf.
stopValidClassErr  Stop criterion for the classification error on the
                   validation data. Default is 101.
...                Additional parameters for the training function.
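A minimal sketch of a typical call, assuming a pre-trained DArch object
`darch` and a DataSet `trainSet` have already been created, and that
backpropagation has been chosen as the fine-tuning function via the
setter `setFineTuneFunction<-` (the variable names here are illustrative,
not part of the documented interface):

```r
## Sketch only: requires the darch package and a prepared DArch object.
setFineTuneFunction(darch) <- backpropagation

darch <- fineTuneDArch(darch, trainSet,
                       numEpochs = 10,      # train for 10 iterations
                       isBin = TRUE,        # threshold outputs at 0.5
                       isClass = TRUE,      # collect classification statistics
                       stopClassErr = 95)   # stop early once the correct-
                                            # classification criterion is met
```

With the default stop criteria (-Inf and 101), training always runs for
the full numEpochs; setting them to reachable values enables early stopping.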
The function trains the given network darch with the function saved in the
attribute fineTuneFunction of the DArch object. The data (trainData,
validData, testData) and the corresponding classes (targetData,
validTargets, testTargets) can be handed over either as a matrix or as an
ff-matrix (see package ff for details). The data and classes for validation
and testing are optional. If they are provided, the network is executed
with these datasets and statistics are calculated. These statistics are
saved in the stats attribute (see Net). The attribute isBin indicates
whether the output data must be interpreted as binary values: if TRUE,
every value over 0.5 is interpreted as 1, otherwise as 0. It is also
possible to set stop criteria for the training on the error (stopErr,
stopValidErr) or on the correct classifications (stopClassErr,
stopValidClassErr) of the training or validation dataset.
DArch, Net, backpropagation, rpropagation, minimizeAutoencoder,
minimizeClassifier