Laplacian matrix of a k-component graph with heavy-tailed data
Description

Computes the Laplacian matrix of a graph on the basis of an observed data matrix, where we assume the data to be Student-t distributed.

Usage
learn_kcomp_heavytail_graph(
X,
k = 1,
heavy_type = "gaussian",
nu = NULL,
w0 = "naive",
d = 1,
beta = 1e-08,
update_beta = TRUE,
early_stopping = FALSE,
rho = 1,
update_rho = FALSE,
maxiter = 10000,
reltol = 1e-05,
verbose = TRUE,
record_objective = FALSE
)

Value

A list containing possibly the following elements:
laplacian            estimated Laplacian matrix
adjacency            estimated adjacency matrix
theta                estimated Laplacian matrix slack variable
maxiter              number of iterations taken to reach convergence
convergence          boolean flag to indicate whether or not the optimization converged
beta_seq             sequence of values taken by the hyperparameter beta until convergence
primal_lap_residual  primal residual for the Laplacian matrix per iteration
primal_deg_residual  primal residual for the degree vector per iteration
dual_residual        dual residual per iteration
lagrangian           Lagrangian value per iteration
elapsed_time         time taken to reach convergence
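The returned list can be inspected directly for convergence diagnostics. The short sketch below assumes learn_kcomp_heavytail_graph is available in the session (for example from an installed package such as fingraph, which is an assumption and is not stated on this page) and uses synthetic Student-t data purely for illustration:

# sketch: inspecting the returned list (assumes the exporting package is loaded)
set.seed(42)
X <- matrix(rt(300 * 10, df = 4), nrow = 300, ncol = 10)  # 300 observations, 10 nodes
results <- learn_kcomp_heavytail_graph(X, k = 1, heavy_type = "student",
                                       nu = 4, verbose = FALSE)

results$convergence                       # TRUE if reltol was reached within maxiter
results$maxiter                           # iterations actually performed
# a k-component graph Laplacian has exactly k (numerically) zero eigenvalues
sum(eigen(results$laplacian)$values < 1e-8)

Since the Laplacian of a k-component graph has exactly k zero eigenvalues, the last line gives a quick sanity check on the number of components in the estimated graph.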
Arguments

X                 an n x p data matrix, where n is the number of observations and p is the number of nodes in the graph.
k                 the number of components of the graph.
heavy_type        a string which selects the statistical distribution of the data. Valid values are "gaussian" or "student".
nu                the degrees of freedom of the Student-t distribution. Must be a real number greater than 2.
w0                initial vector of graph weights. Either a vector of length p(p-1)/2 or a string indicating the method to compute an initial value.
d                 the nodes' degrees. Either a vector or a single value.
beta              hyperparameter that controls the regularization to obtain a k-component graph.
update_beta       whether to update beta during the optimization.
early_stopping    whether to stop the iterations as soon as the rank constraint is satisfied.
rho               constraint relaxation hyperparameter.
update_rho        whether or not to update rho during the optimization.
maxiter           maximum number of iterations.
reltol            relative tolerance used as the convergence criterion.
verbose           whether to show a progress bar during the iterations.
record_objective  whether to record the objective function per iteration.
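A minimal end-to-end sketch of a typical call follows. It assumes the package exporting this function (for example fingraph, which is an assumption and is not named on this page) is installed and loaded, and generates synthetic heavy-tailed data purely for illustration:

library(fingraph)   # assumed package name; replace with the package that exports this function

set.seed(0)
p <- 20                               # number of nodes
n <- 200                              # number of observations
X <- matrix(rt(n * p, df = 5), n, p)  # synthetic Student-t data

fit <- learn_kcomp_heavytail_graph(
  X,
  k = 2,                  # target a 2-component graph
  heavy_type = "student",
  nu = 5,                 # degrees of freedom, must be greater than 2
  verbose = FALSE
)

L <- fit$laplacian   # estimated p x p Laplacian matrix
A <- fit$adjacency   # estimated p x p adjacency matrix

Choosing heavy_type = "student" with nu = 5 matches the heavy-tailed data generated above; with heavy_type = "gaussian" the data would instead be modeled as Gaussian.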