deepNN (version 1.2)

backpropagation_MLP: backpropagation for a multilayer perceptron

Description

A function to perform backpropagation for a multilayer perceptron.
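To make the computation concrete, the block below is a from-scratch illustration of the backward pass for a one-hidden-layer perceptron with logistic activations and a quadratic (Q) loss. It is not deepNN's internal code; the network shape, variable names, and initialisation are chosen for the example only.

# Illustrative sketch only -- NOT deepNN's internal implementation.
set.seed(1)
sigmoid <- function(z) 1 / (1 + exp(-z))

# Network: 3 inputs -> 4 hidden units -> 2 outputs
W1 <- matrix(rnorm(4 * 3, sd = 0.1), 4, 3); b1 <- rnorm(4, sd = 0.1)
W2 <- matrix(rnorm(2 * 4, sd = 0.1), 2, 4); b2 <- rnorm(2, sd = 0.1)

x <- c(0.2, -0.5, 0.7)   # one input vector
y <- c(1, 0)             # its target

# Forward pass (in deepNN this information comes from MLP_net)
z1 <- W1 %*% x + b1; a1 <- sigmoid(z1)
z2 <- W2 %*% a1 + b2; a2 <- sigmoid(z2)
cost <- 0.5 * sum((a2 - y)^2)                  # quadratic (Q) loss

# Backward pass: propagate the error from the output layer back
delta2 <- (a2 - y) * a2 * (1 - a2)             # output-layer error
delta1 <- (t(W2) %*% delta2) * a1 * (1 - a1)   # hidden-layer error

# Gradients of the cost with respect to each parameter
gradW2 <- delta2 %*% t(a1); gradb2 <- delta2
gradW1 <- delta1 %*% t(x);  gradb1 <- delta1

# backpropagation_MLP returns the analogous quantities (the cost plus the
# gradients for every weight and bias in MLPNet), packaged as a list.
list(cost = cost,
     gradients = list(W1 = gradW1, b1 = gradb1, W2 = gradW2, b2 = gradb2))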

Usage

backpropagation_MLP(MLPNet, loss, truth)

Value

A list containing the cost and the gradient of the cost with respect to each of the model parameters.

Arguments

MLPNet

the output of MLP_net, as applied to the data with the current parameter values

loss

the loss function; see ?Qloss and ?multinomial

truth

the ground truth: a list of vectors to compare with the output of the feed-forward network
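For orientation, here is a hedged sketch of where backpropagation_MLP sits in a training loop. The surrounding functions (network, MLP_net, Qloss, train) come from this page's See Also list, but the argument lists shown in the comments are placeholders, not their real signatures; consult their help pages before use.

library(deepNN)

# net    <- network(...)     # define layer sizes/activations; see ?network
# MLPNet <- MLP_net(...)     # forward pass on the data with the current
#                            # weights and biases; see ?MLP_net

# truth: a list of target vectors, one per observation, e.g. one-hot labels
# truth <- lapply(class_id, function(k) replace(numeric(nclass), k, 1))

# Documented call (whether the loss is passed as Qloss or Qloss() is an
# assumption here; see ?Qloss):
# bp <- backpropagation_MLP(MLPNet, loss = Qloss(), truth = truth)

# The returned cost and gradients can drive a gradient-descent update;
# train() wraps this forward/backward loop automatically.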

References

  1. Goodfellow, I., Bengio, Y. and Courville, A. (2016). Deep Learning. MIT Press.

  2. Sejnowski, T. J. (2018). The Deep Learning Revolution. MIT Press.

  3. 3Blue1Brown, Neural Networks (YouTube playlist): https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

  4. Nielsen, M. Neural Networks and Deep Learning: http://neuralnetworksanddeeplearning.com/

See Also

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation