funkSVD
Funk SVD for Matrices with Missing Data
Implements matrix decomposition using the stochastic gradient descent optimization popularized by Simon Funk, minimizing the error on the known values.
 Keywords
 model
Usage
funkSVD(x, k = 10, gamma = 0.015, lambda = 0.001,
  min_improvement = 1e-06, min_epochs = 50, max_epochs = 200,
  verbose = FALSE)
Arguments
 x
 a matrix, potentially containing NAs.
 k
 number of features (i.e., rank of the approximation).
 gamma
 regularization term.
 lambda
 learning rate.
 min_improvement
 required minimum improvement per iteration.
 min_epochs
 minimum number of iterations per feature.
 max_epochs
 maximum number of iterations per feature.
 verbose
 show progress.
Details
Funk SVD decomposes a matrix (with missing values) into two components $U$ and $V$. The singular values are folded into these matrices. The approximation for the original matrix can be obtained by $R = UV'$.
The predict function in this implementation folds in new data rows
by estimating the $u$ vectors using gradient descent and then calculating
the reconstructed complete matrix $r$ for these users via $r = uV'$.
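The fold-in step described above can be sketched in plain R. This is an illustrative sketch, not the package's implementation: the function name `fold_in` and the zero-mean initialization are assumptions; it estimates a new user's feature vector $u$ against a fixed item-feature matrix $V$ by regularized stochastic gradient descent on the known entries, then reconstructs the full row via $r = uV'$.

```r
## Illustrative sketch only (not the recommenderlab implementation).
## new_row: a numeric vector with NAs for unknown ratings.
## V: item-by-feature matrix from a fitted decomposition.
fold_in <- function(new_row, V, lambda = 0.001, gamma = 0.015,
                    epochs = 200) {
  u <- rep(0.1, ncol(V))                 # small initial feature values (assumed)
  known <- which(!is.na(new_row))        # fit only the observed entries
  for (epoch in seq_len(epochs)) {
    for (j in known) {
      err <- new_row[j] - sum(u * V[j, ])            # prediction error
      u <- u + lambda * (err * V[j, ] - gamma * u)   # regularized SGD step
    }
  }
  drop(u %*% t(V))                       # reconstructed complete row r = uV'
}
```

With a rank-1 toy example (say `V = matrix(c(1, 2, 3))` and a new row `c(2, 4, NA)`), the estimated $u$ converges near 2 and the missing entry is reconstructed near 6.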
Value

An object of class "funkSVD" with components U and V.
Note
The code is based on the implementation in package rrecsys by Ludovik Coba and Markus Zanker.
References
Y. Koren, R. Bell, and C. Volinsky. Matrix Factorization Techniques for Recommender Systems, IEEE Computer, pp. 42-49, August 2009.
Examples
### this takes a while to run
## Not run:
data("Jester5k")

train <- as(Jester5k[1:100], "matrix")
fsvd <- funkSVD(train, verbose = TRUE)

### reconstruct the rating matrix as R = UV'
### and calculate the root mean square error on the known ratings
r <- tcrossprod(fsvd$U, fsvd$V)
rmse(train, r)

### fold in new users for matrix completion
test <- as(Jester5k[101:105], "matrix")
p <- predict(fsvd, test, verbose = TRUE)
rmse(test, p)
## End(Not run)