
Description

Implements matrix decomposition by the stochastic gradient descent optimization popularized by Simon Funk to minimize the error on the known values. This function is used by the recommender method "SVDF" (see Recommender).
Usage

funkSVD(x, k = 10, gamma = 0.015, lambda = 0.001,
  min_improvement = 1e-06, min_epochs = 50, max_epochs = 200,
  verbose = FALSE)
Value

An object of class "funkSVD" with components:

U	the U matrix.
V	the V matrix.
parameters	a list with parameter values.
Arguments

x	a matrix, potentially containing NAs.
k	number of features (i.e., rank of the approximation).
gamma	regularization term.
lambda	learning rate.
min_improvement	required minimum improvement per iteration.
min_epochs	minimum number of iterations per feature.
max_epochs	maximum number of iterations per feature.
verbose	show progress.
Details

Funk SVD decomposes a matrix (with missing values) into two components, U and V. The singular values are folded into these matrices, so the approximation of the original matrix can be obtained as R = UV'.

The predict function in this implementation folds in new data rows by estimating the corresponding u vectors via gradient descent and then reconstructing the completed rows as r = uV'.
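To illustrate the gradient descent updates described above, here is a minimal sketch of the per-entry update loop. This is not the package's implementation; the function name `funk_sgd_sketch` is hypothetical, but the update rule is the standard Funk SVD formulation, using `lambda` as the learning rate and `gamma` as the regularization term to match the argument names in this help page.

```r
# Minimal sketch of Funk SVD stochastic gradient descent (NOT the
# recommenderlab implementation). For every known entry x[i, j], nudge
# row i of U and row j of V along the gradient of the squared error.
funk_sgd_sketch <- function(x, k = 2, gamma = 0.015, lambda = 0.001,
                            epochs = 200) {
  U <- matrix(0.1, nrow(x), k)          # latent user features
  V <- matrix(0.1, ncol(x), k)          # latent item features
  known <- which(!is.na(x), arr.ind = TRUE)  # only train on observed cells
  for (epoch in seq_len(epochs)) {
    for (n in seq_len(nrow(known))) {
      i <- known[n, 1]; j <- known[n, 2]
      err <- x[i, j] - sum(U[i, ] * V[j, ])   # prediction error
      u_old <- U[i, ]                          # keep pre-update copy
      U[i, ] <- U[i, ] + lambda * (err * V[j, ] - gamma * U[i, ])
      V[j, ] <- V[j, ] + lambda * (err * u_old - gamma * V[j, ])
    }
  }
  list(U = U, V = V)
}

# toy usage: approximate a small matrix with one missing value
m <- matrix(c(1, 2, NA, 2, 4, 6), nrow = 2, byrow = TRUE)
fit <- funk_sgd_sketch(m, k = 1, lambda = 0.05, epochs = 500)
r <- tcrossprod(fit$U, fit$V)  # reconstruction R = UV'
```

The missing cell `r[1, 3]` is filled in by the learned low-rank structure, which is the same mechanism the package's predict method uses for fold-in.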
References

Y. Koren, R. Bell, and C. Volinsky. Matrix Factorization Techniques for Recommender Systems, IEEE Computer, pp. 42-49, August 2009.
Examples

# this takes a while to run!
if (FALSE) {
data("Jester5k")
# helper to calculate the root mean squared error
rmse <- function(pred, truth) sqrt(mean((truth - pred)^2, na.rm = TRUE))
train <- as(Jester5k[1:100], "matrix")
fsvd <- funkSVD(train, verbose = TRUE)
# reconstruct the original rating matrix as R = UV'
r <- tcrossprod(fsvd$U, fsvd$V)
rmse(train, r)
# fold in new users for matrix completion
test <- as(Jester5k[101:105], "matrix")
p <- predict(fsvd, test, verbose = TRUE)
rmse(test, p)
}