Description:

A function to define a multilayer perceptron and compute quantities for backpropagation, if needed.
Usage:

MLP_net(input, weights, bias, dims, nlayers, activ, back = TRUE, regulariser)
Value:

A list object containing the evaluated forward pass and, if requested, the quantities needed for backpropagation.
Arguments:

input: input data, a list of vectors (i.e. a ragged array)
weights: a list object containing weights for the forward pass, see ?weights2list
bias: a list object containing biases for the forward pass, see ?bias2list
dims: the dimensions of the network as stored from a call to the function network, see ?network
nlayers: number of layers as stored from a call to the function network, see ?network
activ: list of activation functions as stored from a call to the function network, see ?network
back: logical, whether to compute quantities for backpropagation (set to FALSE for feed-forward use only)
regulariser: type of regularisation strategy to use, see ?train, ?no_regularisation, ?L1_regularisation, ?L2_regularisation
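The forward pass that MLP_net evaluates can be illustrated with a minimal, self-contained sketch. The forward_pass helper below, along with its example weights, biases and dimensions, is an assumption for illustration only and is not the package's implementation; for the real API see ?MLP_net, ?weights2list, ?bias2list and ?network.

```r
# Illustrative sketch (not the package implementation): a forward pass
# through a fully connected network with logistic activations.
logistic <- function(x) 1 / (1 + exp(-x))

forward_pass <- function(input, weights, bias, activ) {
  a <- input
  for (l in seq_along(weights)) {
    z <- weights[[l]] %*% a + bias[[l]]  # pre-activation for layer l
    a <- activ[[l]](z)                   # post-activation, fed to next layer
  }
  a  # network output
}

# Hypothetical network: 3 inputs, one hidden layer of 4 units, 2 outputs.
set.seed(1)
weights <- list(matrix(rnorm(4 * 3), 4, 3), matrix(rnorm(2 * 4), 2, 4))
bias    <- list(rnorm(4), rnorm(2))
out <- forward_pass(c(0.5, -1, 2), weights, bias, list(logistic, logistic))
```

Setting back = TRUE in MLP_net additionally stores the per-layer pre- and post-activation quantities that a sketch like this discards, since backpropagation needs them to form the gradients.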
References:

Ian Goodfellow, Yoshua Bengio, Aaron Courville, Francis Bach. Deep Learning. MIT Press (2016).
Terrence J. Sejnowski. The Deep Learning Revolution. MIT Press (2018).
Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
http://neuralnetworksanddeeplearning.com/
See Also:

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation