[Under development] Transfer learning
Usage

transferLearning(
  plpResult,
  plpData,
  population,
  fixLayers = T,
  includeTop = F,
  addLayers = c(100, 10),
  layerDropout = c(T, T),
  layerActivation = c("relu", "softmax"),
  outcomeWeight = 1,
  batchSize = 10000,
  epochs = 20
)
Arguments

plpResult        The plp result from training a keras deep learning model on big data
plpData          The new data to fine-tune the model on
population       The population for the new data
fixLayers        Boolean specifying whether to fix the weights in the model being transferred
includeTop       If TRUE the final layer of the model being transferred is removed
addLayers        Vector specifying the nodes in each layer to add, e.g. c(100, 10) will add a layer with 100 nodes followed by a final layer with 10
layerDropout     Whether to add dropout to each new layer (binary vector with the length of addLayers)
layerActivation  Activation function for each new layer (string vector with the length of addLayers)
outcomeWeight    The weight to assign class 1 when training the model
batchSize        Size of each batch when updating the layers
epochs           Number of epochs to run
Examples

# NOT RUN {
modelSet <- setDeepNN()
plpResult <- runPlp(plpData, population, modelSettings = modelSet, ...)

transferLearning(...)
# }
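As an illustrative sketch only: the call below shows how the arguments fit together when fine-tuning on a second database. The objects plpData2 and population2 are hypothetical inputs (assumed to come from the usual data-extraction and study-population steps on the new database), not part of the original example.

```r
# Hypothetical sketch: fine-tune a previously trained deep learning model
# on data from a new database. plpData2 and population2 are assumed inputs.
transferLearning(
  plpResult = plpResult,        # result of runPlp() on the original big data
  plpData = plpData2,           # new data to fine-tune on
  population = population2,     # population for the new data
  fixLayers = TRUE,             # fix the weights of the transferred layers
  includeTop = FALSE,           # remove the original final layer
  addLayers = c(100, 10),       # add a 100-node layer, then a 10-node layer
  layerDropout = c(TRUE, TRUE), # dropout on each added layer
  layerActivation = c("relu", "softmax"),
  outcomeWeight = 1,
  batchSize = 10000,
  epochs = 20
)
```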