Jacobian Matrix of Gradient Function for Training Datasets
Calculates the Jacobian matrix of the gradient function for a training dataset. It takes a fitted neural network model as input and returns the gradient with respect to each weight parameter. The resulting matrix has dimension nObs x nPara, where nObs is the number of training observations and nPara is the number of weight parameters.
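For the single-hidden-layer architecture used in the example below, nPara can be counted directly. This is a minimal sketch; it assumes the usual nnet-style parameterisation with bias terms, and the variable names are illustrative:

```r
# Count weight parameters for a single-hidden-layer network with biases
# (assumption: nnet-style parameterisation with one bias per unit)
nInput  <- 2   # input units
nHidden <- 3   # hidden units (size = 3 in the example below)
nOutput <- 1   # output units
nPara <- nHidden * (nInput + 1) + nOutput * (nHidden + 1)
nPara  # 13, so the Jacobian for 300 observations is a 300 x 13 matrix
```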
jacobian(object, ...)
"jacobian"(object, xTrain, funName = 'sigmoid',...)
"jacobian"(object, xTrain, funName = 'sigmoid',...)
"jacobian"(object, xTrain, funName = 'sigmoid',...)
Returns the Jacobian matrix of the gradient function, in which element J[i,j] is the gradient with respect to the j-th weight parameter for the i-th training observation. The dimension is nObs x nPara.
library(nnet)
# Generate training data: two clusters of 150 two-dimensional points each
xTrain <- rbind(
  cbind(runif(150, min = 0, max = 0.5), runif(150, min = 0, max = 0.5)),
  cbind(runif(150, min = 0.5, max = 1), runif(150, min = 0.5, max = 1))
)
nObs <- dim(xTrain)[1]
yTrain <- 0.5 + 0.4 * sin(2 * pi * xTrain %*% c(0.4, 0.6)) + rnorm(nObs, mean = 0, sd = 0.05)
# Training nnet models
net <- nnet(yTrain ~ xTrain, size = 3, rang = 0.1, decay = 5e-4, maxit = 500)
# Calculating Jacobian Matrix of the training samples
library(nnetpredint)
jacobMat <- jacobian(net, xTrain)
dim(jacobMat)
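One way to sanity-check a column of the result is a one-sided finite difference, perturbing a single weight of the fitted model. This is a sketch only: it assumes J[i,j] is the derivative of the i-th fitted output with respect to the j-th weight, and `netPlus` is an illustrative copy, not part of the package API:

```r
# Numerical check of the first Jacobian column (assumption: J[i,j] = d yhat_i / d w_j)
eps <- 1e-6
netPlus <- net                          # copy of the fitted model
netPlus$wts[1] <- netPlus$wts[1] + eps  # perturb the first weight
numGrad <- (predict(netPlus, xTrain) - predict(net, xTrain)) / eps
max(abs(numGrad - jacobMat[, 1]))       # should be small if the gradients agree
```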