recommenderlab (version 0.2-1)

funkSVD: Funk SVD for Matrices with Missing Data

Description

Implements matrix decomposition by stochastic gradient descent optimization, popularized by Simon Funk, to minimize the error on the known values.

Usage

funkSVD(x, k = 10, gamma = 0.015, lambda = 0.001,
  min_improvement = 1e-06, min_epochs = 50, max_epochs = 200,
  verbose = FALSE)

Arguments

x
a matrix, potentially containing NAs.
k
number of features (i.e., rank of the approximation).
gamma
regularization term.
lambda
learning rate.
min_improvement
required minimum improvement per iteration.
min_epochs
minimum number of iterations per feature.
max_epochs
maximum number of iterations per feature.
verbose
show progress.
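
As a sketch of how these arguments fit together (the toy matrix and the chosen settings below are illustrative, not taken from the package documentation), a rank-2 decomposition of a small matrix with missing entries could look like this:

library("recommenderlab")

## small toy matrix with missing entries
m <- matrix(c(5,  3, NA,  1,
              4, NA, NA,  1,
              1,  1, NA,  5,
             NA,  1,  5,  4), nrow = 4, byrow = TRUE)

## k features; gamma is the regularization term, lambda the learning rate
fit <- funkSVD(m, k = 2, gamma = 0.015, lambda = 0.001,
  min_epochs = 100, verbose = TRUE)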

Value

An object of class "funkSVD" with components including U and V, the two feature matrices of the decomposition.
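
Continuing the sketch above (the object name fit is illustrative), the two feature matrices can be inspected and multiplied back together; U has one row per row of x and V one row per column of x, each with k columns:

dim(fit$U)                       ## nrow(x) x k row features
dim(fit$V)                       ## ncol(x) x k column features
r <- tcrossprod(fit$U, fit$V)    ## rank-k reconstruction r = UV'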

Details

Funk SVD decomposes a matrix (with missing values) into two components $U$ and $V$. The singular values are folded into these matrices. The approximation for the original matrix can be obtained by $R = UV'$.
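
In its usual Funk-style formulation (a sketch; the implementation trains the k features one after the other and its exact update schedule may differ), the stochastic gradient descent minimizes the regularized squared error over the known entries,

$\sum_{(i,j):\ r_{ij}\ \mathrm{known}} (r_{ij} - u_i v_j')^2 + \gamma (\lVert u_i \rVert^2 + \lVert v_j \rVert^2)$,

where $u_i$ and $v_j$ are rows of $U$ and $V$ and $\gamma$ is the regularization term. For each known entry, with error $e_{ij} = r_{ij} - u_i v_j'$ and learning rate $\lambda$, the updates are

$u_i \leftarrow u_i + \lambda (e_{ij} v_j - \gamma u_i)$
$v_j \leftarrow v_j + \lambda (e_{ij} u_i - \gamma v_j)$.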

The predict function in this implementation folds in new data rows by estimating the $u$ vectors using gradient descent and then calculating the reconstructed complete matrix $r$ for these users via $r = uV'$.
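
A minimal way to write this fold-in step (assuming, as is usual, that $V$ is held fixed and only the new row's feature vector is fitted): for a new row $x$ with known entries $x_j$, gradient descent minimizes $\sum_{j:\ x_j\ \mathrm{known}} (x_j - u v_j')^2 + \gamma \lVert u \rVert^2$ over $u$, and the completed row is then $r = uV'$.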

References

Y. Koren, R. Bell, and C. Volinsky. Matrix Factorization Techniques for Recommender Systems, IEEE Computer, pp. 42-49, August 2009.

Examples

### this takes a while to run
## Not run: 
library("recommenderlab")
data("Jester5k")

train <- as(Jester5k[1:100], "matrix")
fsvd <- funkSVD(train, verbose = TRUE)

### reconstruct the rating matrix as R = UV'
### and calculate the root mean square error on the known ratings
r <- tcrossprod(fsvd$U, fsvd$V)
rmse(train, r)

### fold in new users for matrix completion
test <- as(Jester5k[101:105], "matrix")
p <- predict(fsvd, test, verbose = TRUE)
rmse(test, p)
## End(Not run)
