Fit SoftImpute/SoftSVD via fast alternating least squares. Based on the paper "Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares" by Trevor Hastie, Rahul Mazumder, Jason D. Lee and Reza Zadeh - https://arxiv.org/pdf/1410.2596.pdf
soft_impute(x, rank = 10L, lambda = 0, n_iter = 100L,
  convergence_tol = 0.001, init = NULL, final_svd = TRUE)

soft_svd(x, rank = 10L, lambda = 0, n_iter = 100L,
  convergence_tol = 0.001, init = NULL, final_svd = TRUE)
sparse matrix. Both CSR (dgRMatrix) and CSC (dgCMatrix) formats are supported.
For CSR matrices we suggest loading the https://github.com/dselivanov/MatrixCSR package,
which provides multithreaded CSR*dense matrix products (if OpenMP is supported on your platform).
On many-core machines this reduces fitting time significantly.
maximum rank of the low-rank solution.
regularization parameter for the nuclear-norm penalty
maximum number of iterations of the algorithm
convergence tolerance. Internally we track the relative change of the Frobenius norm between two consecutive iterations.
svd-like object with u, v, d components used to initialize the algorithm.
The algorithm benefits from warm starts. init may have any rank up to the maximum allowed rank;
if its rank is smaller than the maximum rank, it will be padded automatically.
logical, whether to perform a final post-processing step with SVD.
This is not strictly necessary, but it cleans up the rank nicely - it is highly recommended to leave it TRUE.
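To illustrate how lambda acts as a nuclear-norm regularizer, here is a minimal base-R sketch of a soft-thresholded truncated SVD (illustration only - the package itself uses fast alternating least squares rather than a full SVD; the function name `soft_threshold_svd` is hypothetical):

```r
# Sketch: truncate the SVD to `rank` and shrink singular values by `lambda`.
# Singular values shrunk to zero are dropped, so lambda can reduce the
# effective rank of the solution.
soft_threshold_svd = function(x, rank = 10L, lambda = 0) {
  s = svd(x, nu = rank, nv = rank)
  d = pmax(s$d[seq_len(rank)] - lambda, 0)  # soft-thresholding step
  keep = d > 0
  list(u = s$u[, keep, drop = FALSE],
       d = d[keep],
       v = s$v[, keep, drop = FALSE])
}

set.seed(1)
x = matrix(rnorm(20 * 8), 20, 8)
res = soft_threshold_svd(x, rank = 5L, lambda = 1)
```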
svd-like object - a list with u, v, d
components: left singular vectors, right singular vectors and singular values.
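A hypothetical usage example, assuming the API documented above is available from the package (the matrix and parameter values are illustrative; `rsparsematrix()` from the Matrix package generates a random dgCMatrix):

```r
library(Matrix)
# simulate a sparse CSC (dgCMatrix) input matrix
set.seed(42)
x = rsparsematrix(100, 50, density = 0.1)

# fit SoftImpute; the result is an svd-like list with u, v, d
fit = soft_impute(x, rank = 5L, lambda = 1, n_iter = 50L)

# reconstruct the low-rank approximation: u %*% diag(d) %*% t(v)
x_hat = fit$u %*% (fit$d * t(fit$v))
```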