A function to build a prediction model using the Mini-Batch Gradient Descent (MBGD) method.
MBGD(dataTrain, alpha = 0.1, maxIter = 10, nBatch = 2, seed = NULL)
a data.frame representing the training data.
a float value representing the learning rate. Default value is 0.1.
the maximum number of iterations.
an integer value representing the number of batches the training data is divided into. Default value is 2.
an integer value used to seed the random number generator. Default value is NULL, which means the function will not set a seed.
a matrix of theta (coefficients) for the linear model.
This function is based on the GD method, with the optimization that each iteration uses only a portion of the training data. MBGD has a parameter named nBatch that specifies how many batches the training data is split into.
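To illustrate the idea, here is a minimal sketch of mini-batch gradient descent for linear regression. It assumes the last column of dataTrain is the output variable and that rows are assigned to nBatch interleaved batches; mbgdSketch is a hypothetical helper written for this page, not the package's actual implementation.

```r
## Minimal mini-batch gradient descent sketch (assumption: last column is y).
mbgdSketch <- function(dataTrain, alpha = 0.1, maxIter = 10, nBatch = 2, seed = NULL) {
  if (!is.null(seed)) set.seed(seed)
  X <- cbind(1, as.matrix(dataTrain[, -ncol(dataTrain)]))  # add intercept column
  y <- dataTrain[, ncol(dataTrain)]
  theta <- matrix(0, nrow = 1, ncol = ncol(X))   # row vector of coefficients
  batchId <- rep_len(seq_len(nBatch), nrow(X))   # assign rows to batches
  for (iter in seq_len(maxIter)) {
    for (b in seq_len(nBatch)) {
      rows <- which(batchId == b)
      Xb <- X[rows, , drop = FALSE]
      yb <- y[rows]
      error <- Xb %*% t(theta) - yb              # residuals on this batch only
      grad <- t(error) %*% Xb / length(rows)     # gradient of mean squared error
      theta <- theta - alpha * grad              # update step
    }
  }
  theta
}
```

Each update touches only one batch, so an iteration over all nBatch batches costs the same as one full-batch GD step but applies nBatch parameter updates.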
A. Cotter, O. Shamir, N. Srebro, K. Sridharan, "Better Mini-Batch Algorithms via Accelerated Gradient Methods", NIPS, pp. 1647- (2011)
##################################
## Learning and Build Model with MBGD
## load R Package data
data(gradDescentRData)
## get z-factor data
dataSet <- gradDescentRData$CompressilbilityFactor
## split dataset
splitedDataSet <- splitData(dataSet)
## build model with MBGD using nBatch = 2
MBGDmodel <- MBGD(splitedDataSet$dataTrain, nBatch=2)
## show result
print(MBGDmodel)
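The returned theta can also be applied to new inputs by hand. A minimal sketch, assuming theta is a 1 x (p+1) row matrix with the intercept coefficient first (predictSketch is a hypothetical helper, not part of the package):

```r
## Hypothetical helper: prepend an intercept column and take the inner
## product with the coefficient row vector theta (1 x (p+1)).
predictSketch <- function(theta, newX) {
  cbind(1, as.matrix(newX)) %*% t(theta)
}

## usage sketch with made-up coefficients theta = (2, 3), i.e. y = 2 + 3x
theta <- matrix(c(2, 3), nrow = 1)
predictSketch(theta, data.frame(x = c(0, 1)))
```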