
nethet (version 1.4.2)

mixglasso: mixglasso

Description

mixglasso performs mixture-of-graphical-lasso network clustering, fitting a model for one or several numbers of mixture components.

Usage

mixglasso(x, n.comp, lambda = sqrt(2 * nrow(x) * log(ncol(x)))/2,
  pen = "glasso.parcor", init = "kmeans.hc", my.cl = NULL,
  modelname.hc = "VVV", nstart.kmeans = 1, iter.max.kmeans = 10,
  term = 10^{-3}, min.compsize = 5, save.allfits = FALSE,
  filename = "mixglasso_fit.rda", mc.flag = FALSE, mc.set.seed = FALSE,
  mc.preschedule = FALSE, mc.cores = getOption("mc.cores", 2L), ...)

Arguments

x
Input data matrix
n.comp
Number of mixture components. If n.comp is a vector, mixglasso will estimate a model for each number of mixture components, and return a list of models, as well as their BIC and MMDL scores and the index of the best model according to each score.
lambda
Regularization parameter. Default=sqrt(2*n*log(p))/2, where n=nrow(x) and p=ncol(x).
pen
Determines the form of the penalty: 'glasso.parcor' (default) penalises the partial correlation matrix, 'glasso.invcov' penalises the inverse covariance matrix (this corresponds to the classical graphical lasso), and 'glasso.invcor' penalises the inverse correlation matrix.
init
Initialization method; one of 'cl.init', 'r.means', 'random', 'kmeans', 'kmeans.hc', 'hc'. Default='kmeans.hc'
my.cl
Initial cluster assignments; need to be provided if init='cl.init' (otherwise this param is ignored). Default=NULL
modelname.hc
Model class used in hc. Default="VVV"
nstart.kmeans
Number of random starts in kmeans; default=1
iter.max.kmeans
Maximal number of iterations in kmeans; default=10
term
Termination criterion of EM algorithm. Default=10^-3
min.compsize
Stop the EM algorithm if the size of any component falls below min.compsize. Default=5
save.allfits
If TRUE, save the output of mixglasso for every number of mixture components in n.comp.
filename
If save.allfits is TRUE, the output of mixglasso will be saved as paste(filename, '_fit.mixgl_k.rda', sep='').
mc.flag
If TRUE, use parallel execution for each value of n.comp via function mclapply of package parallel (see the sketch after this argument list).
mc.set.seed
See mclapply. Default=FALSE
mc.preschedule
See mclapply. Default=FALSE
mc.cores
Number of cores to use in parallel execution. Defaults to mc.cores option if set, or 2 otherwise.
...
Other arguments. See mixglasso_init
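
As an illustration only (not taken from the package documentation), the following sketch sets lambda to its documented default explicitly and runs the fits for several values of n.comp in parallel; x stands for your own input data matrix:

## Sketch: 'x' is a placeholder for your data matrix.
lambda.def <- sqrt(2 * nrow(x) * log(ncol(x))) / 2
fit <- mixglasso(x, n.comp = 1:4, lambda = lambda.def,
                 mc.flag = TRUE, mc.cores = 2)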

Value

  • A list with the following elements (a usage sketch follows the list):
  • models: List in which element i contains an S3 object of class 'nethetclustering' with the result of fitting the mixture graphical lasso model with n.comp[i] components. See the documentation of mixglasso_ncomp_fixed for a description of this object.
  • bic: BIC for all fits.
  • mmdl: Minimum description length score for all fits.
  • comp: Component assignments for all fits.
  • bic.opt: Index of the model with the optimal BIC score.
  • mmdl.opt: Index of the model with the optimal MMDL score.
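
Assuming the element names listed above, the optimal fits of a multi-component run can be retrieved as in this sketch (fit denotes the result of a call with a vector-valued n.comp):

## Sketch only; element names as documented under 'Value'.
best.bic  <- fit$models[[fit$bic.opt]]    # model with the optimal BIC score
best.mmdl <- fit$models[[fit$mmdl.opt]]   # model with the optimal MMDL score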

Details

Runs mixture of graphical lasso network clustering with one or several numbers of mixture components.

Examples

################################################
## This is an example of how to use MixGLasso ##
################################################

##generate data
set.seed(1)
n <- 1000
n.comp <- 3
p <- 10

# Create different mean vectors
Mu <- matrix(0,p,n.comp)

nonzero.mean <- split(sample(1:p),rep(1:n.comp,length=p))
for(k in 1:n.comp){
  Mu[nonzero.mean[[k]],k] <- -2/sqrt(ceiling(p/n.comp))
}

sim <- sim_mix_networks(n, p, n.comp, Mu=Mu)

##run mixglasso
set.seed(1)
fit1 <-  mixglasso(sim$data,n.comp=1:6)
fit1$bic
set.seed(1)
fit2 <-  mixglasso(sim$data,n.comp=6)
fit2$bic
set.seed(1)
fit3 <-  mixglasso(sim$data,n.comp=1:6,lambda=0)
set.seed(1)
fit4 <-  mixglasso(sim$data,n.comp=1:6,lambda=Inf)
#set.seed(1)
#fit5 <-  bwprun_mixglasso(sim$data,n.comp=1,n.comp.max=5,selection.crit='bic')
#plot(fit5$selcrit,ylab='bic',xlab='Num.Comps',type='b')

##compare bic
library('ggplot2')
plotting.frame <- data.frame(BIC = c(fit1$bic, fit3$bic, fit4$bic),
                             Num.Comps = rep(1:6, 3),
                             Lambda = rep(c('Default', 'Lambda = 0', 'Lambda = Inf'),
                                          each = 6))

p <- ggplot(plotting.frame) +
  geom_line(aes(x = Num.Comps, y = BIC, colour = Lambda))

print(p)
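
## Follow-up sketch (relies on the element names documented under 'Value'):
## index of the BIC-optimal model for each fit; since n.comp = 1:6, this
## index equals the selected number of components.
fit1$bic.opt
fit3$bic.opt
fit4$bic.opt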
