
Implements matrix decomposition by the stochastic gradient descent optimization popularized by Simon Funk to minimize the error on the known values.
Usage

funkSVD(x, k = 10, gamma = 0.015, lambda = 0.001,
  min_improvement = 1e-06, min_epochs = 50, max_epochs = 200,
  verbose = FALSE)
Arguments

x: a matrix, potentially containing NAs.
k: number of features (i.e., rank of the approximation).
gamma: regularization term.
lambda: learning rate.
min_improvement: required minimum improvement per iteration.
min_epochs: minimum number of iterations per feature.
max_epochs: maximum number of iterations per feature.
verbose: show progress.
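For intuition, the following is a minimal sketch of the optimization these arguments control: stochastic gradient descent over the known entries, with learning rate lambda and regularization term gamma. It updates all k features jointly for brevity, whereas Funk's scheme trains one feature at a time (which is why the epoch limits above are per feature); sgd_sketch is a hypothetical illustration, not the package's implementation.

sgd_sketch <- function(x, k = 10, gamma = 0.015, lambda = 0.001,
                       epochs = 200) {
  ### small random start values to break symmetry between features
  U <- matrix(rnorm(nrow(x) * k, sd = 0.01), nrow(x), k)
  V <- matrix(rnorm(ncol(x) * k, sd = 0.01), ncol(x), k)
  known <- which(!is.na(x), arr.ind = TRUE)  ### indices of known entries
  for (e in seq_len(epochs)) {
    for (n in seq_len(nrow(known))) {
      i <- known[n, 1]; j <- known[n, 2]
      err <- x[i, j] - sum(U[i, ] * V[j, ])   ### error on this known value
      u_old <- U[i, ]
      U[i, ] <- U[i, ] + lambda * (err * V[j, ] - gamma * U[i, ])
      V[j, ] <- V[j, ] + lambda * (err * u_old - gamma * V[j, ])
    }
  }
  list(U = U, V = V)
}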
Value

An object of class "funkSVD" with components
U: the U matrix.
V: the V matrix.
parameters: a list with parameter values.
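After a fit such as fsvd <- funkSVD(train) (see the Examples below), the components can be inspected, e.g.:

dim(fsvd$U)        ### one row per row of the input matrix, k columns
dim(fsvd$V)        ### one row per column of the input matrix, k columns
fsvd$parameters    ### the parameter values used for the fit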
Details

Funk SVD decomposes a matrix (with missing values) into two components, U and V. The singular values are folded into these matrices, so the approximation of the original matrix is obtained by R = UV'.

The predict function in this implementation folds in new data rows by estimating the u vectors using gradient descent with V held fixed and then calculating the reconstructed complete matrix for these rows via r = uV'.
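As a rough sketch of this fold-in step (an illustration under the same argument conventions as above, not the package's code; fold_in is a hypothetical helper), a single new row with missing values could be completed like this:

fold_in <- function(newrow, V, gamma = 0.015, lambda = 0.001,
                    epochs = 200) {
  u <- rep(0.1, ncol(V))       ### feature vector for the new row
  known <- which(!is.na(newrow))
  for (e in seq_len(epochs)) {
    for (j in known) {
      err <- newrow[j] - sum(u * V[j, ])
      u <- u + lambda * (err * V[j, ] - gamma * u)
    }
  }
  drop(u %*% t(V))             ### reconstructed complete row r = uV'
}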
References

Y. Koren, R. Bell, and C. Volinsky. Matrix Factorization Techniques for Recommender Systems. IEEE Computer, pp. 42-49, August 2009.
Examples

### this takes a while to run
library("recommenderlab")

### helper (assumed, not shown in the original example): root mean
### square error on the known (non-NA) ratings
rmse <- function(pred, truth) sqrt(mean((truth - pred)^2, na.rm = TRUE))
data("Jester5k")
train <- as(Jester5k[1:100], "matrix")
fsvd <- funkSVD(train, verbose = TRUE)
### reconstruct the rating matrix as R = UV'
### and calculate the root mean square error on the known ratings
r <- tcrossprod(fsvd$U, fsvd$V)
rmse(train, r)
### fold in new users for matrix completion
test <- as(Jester5k[101:105], "matrix")
p <- predict(fsvd, test, verbose = TRUE)
rmse(test, p)