
softmaxreg (version 1.2)

trainModel: Train softmax regression and classification model

Description

This function implements a feedforward neural network with multiple hidden layers and a softmax output layer. For classification, set type to "class" and pass y as a factor vector; for regression, set type to "raw" and pass y as a matrix with dimension nObs * K, where K denotes the number of groups.
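For example, a minimal sketch of the "raw" regression form, assuming the iris data and a hand-built nObs * K indicator matrix as the target (the $fitted.values element is assumed to hold the fitted group probabilities, as in the classification example below):

library(softmaxreg)
data(iris)
x = iris[, 1:4]
# one indicator column per class: an nObs * K matrix of 0/1 targets
yMat = model.matrix(~ Species - 1, data = iris)
raw_model = trainModel(x, yMat, hidden = c(5), funName = 'sigmoid', maxit = 3000,
    rang = 0.1, type = "raw", algorithm = "adagrad", rate = 0.05, threshold = 1e-3)
head(raw_model$fitted.values)  # fitted probabilities per group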

Usage

trainModel(x, y, hidden, funName, maxit, rang, type, algorithm, rate, L2, penalty, threshold, batch)

Arguments

x
matrix or data frame of x input values.
y
vector of target values for 'class' type classification, or matrix or data frame of target values for 'raw' type regression.
hidden
vector of integers specifying the number of hidden nodes in each layer.
funName
name of the neuron activation function, e.g. 'sigmoid', 'tanh', or 'relu'. Default 'sigmoid'.
maxit
maximum number of iterations. Default 3000.
rang
parameter for the range of the initial random weights, which are drawn from [-rang, rang]. Default 0.1.
type
parameter indicating the type of softmax task: "class" denotes the softmax classification model, whose fitted values are factors; "raw" denotes the softmax regression model, whose fitted values are the probability (percentage) of each group. Default "class".
algorithm
parameter indicating which gradient descent learning algorithm to use: "sgd", "adagrad", "rmsprop", or "adadelta". Default "adagrad" (a call combining these optimization arguments is sketched after this list).
rate
learning rate for the gradient descent algorithm. Default 0.05.
L2
Boolean variable indicating whether an L2 regularization term is added to the loss function and gradient to prevent overfitting. Default FALSE.
penalty
Parameter for the penalty cost of the L2 regularization term if L2 is TRUE. Default 1e-4.
threshold
Parameter for the convergence threshold: iteration stops once the loss value falls below this threshold. Default 1e-4.
batch
Parameter for mini-batch size. Default 50.
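
As a sketch of how the optimization-related arguments fit together (reusing the iris classification setup from the Examples below; the argument values are illustrative, not recommendations):

library(softmaxreg)
data(iris)
# switch the optimizer to "rmsprop" and add L2 regularization with its penalty cost;
# batch sets the mini-batch size and threshold the convergence criterion
reg_model = trainModel(iris[, 1:4], iris$Species, hidden = c(5), funName = 'sigmoid',
    maxit = 3000, rang = 0.1, type = "class", algorithm = "rmsprop", rate = 0.05,
    L2 = TRUE, penalty = 1e-4, threshold = 1e-3, batch = 50)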

Value

object of class "softmax"

See Also

softmaxReg

Examples

## Not run: 
# library(softmaxreg)
# data(iris)
# x = iris[,1:4]
# y = iris$Species
# softmax_model = trainModel(x, y, hidden = c(5), funName = 'sigmoid', maxit = 3000,
#     rang = 0.1, type = "class", algorithm = "adagrad", rate = 0.05, threshold = 1e-3)
# summary(softmax_model)
# yFitMat = softmax_model$fitted.values
# # pick the column (group) with the highest fitted probability for each observation
# yFit = apply(yFitMat, 1, which.max)
# table(y, yFit)
## End(Not run)
