
GMCM (version 1.1.1)

EMAlgorithm: EM algorithm for Gaussian mixture models

Description

The regular expectation-maximization algorithm for general multivariate Gaussian mixture models.

Usage

EMAlgorithm(x, theta, eps = 1e-06, max.ite = 1e+05, trace.theta = FALSE,
  verbose = FALSE)

Arguments

x
A matrix of observations where rows correspond to features and columns to experiments.
theta
A list of parameters as described in rtheta.
eps
The maximal required difference in successive likelihoods to establish convergence.
max.ite
The maximum number of iterations.
trace.theta
Logical. If TRUE, all estimates are stored and returned. Default is FALSE.
verbose
Set to TRUE for verbose output. Default is FALSE.
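
To see how these arguments fit together, here is a minimal sketch of a call with the tuning arguments spelled out (assuming the GMCM package is installed; the simulated input follows the Examples section below):

library(GMCM)

set.seed(1)
sim <- SimulateGMCMData(n = 500, d = 2, m = 3)  # sim$z: observations to cluster
init <- rtheta(d = 2, m = 3)                    # random starting parameters

# Tighter convergence tolerance, a hard cap on iterations, keep all
# intermediate estimates, and print progress while fitting.
fit <- GMCM:::EMAlgorithm(sim$z, theta = init,
                          eps = 1e-8, max.ite = 500,
                          trace.theta = TRUE, verbose = TRUE)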

Value

A list of length 3 with elements (illustrated in the sketch below):
  • theta: A list of the estimated parameters as described in rtheta.
  • loglik.tr: A numeric vector of the log-likelihood trace.
  • kappa: A matrix where kappa[i,j] is the probability that x[i, ] is realized from the j'th component.
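
To make this structure concrete, here is a minimal sketch of inspecting the returned list (assuming a fitted object res, as produced in the Examples below):

# Estimated mixture parameters, in the same format as rtheta()
res$theta

# Final log-likelihood reached by the EM algorithm
tail(res$loglik.tr, 1)

# Hard cluster assignment: the component with the highest posterior probability
cluster <- apply(res$kappa, 1, which.max)
table(cluster)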

Details

Though not as versatile, the algorithm can be a faster alternative to Mclust in the mclust-package.
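
For a rough sense of how the two approaches compare, the sketch below clusters the same data with mclust (assuming the mclust package is installed; Mclust, its G argument, and the classification element belong to mclust, not GMCM). It reuses the data and res objects constructed in the Examples section below.

library(mclust)

# Fit a 3-component Gaussian mixture with mclust on the same data
mc <- Mclust(data$z, G = 3)

# Cross-tabulate the hard assignments from the two fits; labels may be permuted
table(EM = apply(res$kappa, 1, which.max), Mclust = mc$classification)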

See Also

rtheta, PseudoEMAlgorithm

Examples

# Simulate data; data$z holds the latent Gaussian mixture observations
# and data$K the true component labels
set.seed(10)
data <- SimulateGMCMData(n = 1000, d = 2, m = 3)

# Random starting parameters and EM fit of the Gaussian mixture model
start.theta <- rtheta(d = 2, m = 3)
res <- GMCM:::EMAlgorithm(data$z, theta = start.theta)

# Plot the simulated classes next to the GMM clustering
par(mfrow = c(1,2))
plot(data$z, cex = 0.5, pch = 16, main = "Simulated data",
     col = rainbow(3)[data$K])
plot(data$z, cex = 0.5, pch = 16, main = "GMM clustering",
     col = rainbow(3)[apply(res$kappa,1,which.max)])
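
As a follow-up check, the log-likelihood trace returned by the algorithm can be plotted to verify that the EM iterations converged (a minimal sketch using base graphics):

# The log-likelihood should be non-decreasing over the EM iterations
plot(res$loglik.tr, type = "b",
     xlab = "Iteration", ylab = "Log-likelihood", main = "EM convergence")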
