me.weighted: EM algorithm with weights starting with M-step for parameterized MVN mixture models
Description
Implements the EM algorithm for fitting MVN mixture models parameterized by eigenvalue decomposition, when observations have weights, starting with the maximization step.
Usage
me.weighted(modelName, data, z, weights = NULL, prior = NULL, control = emControl(), Vinv = NULL, warn = NULL, ...)
Arguments
modelName
A character string indicating the model. The help file for mclustModelNames describes the available models.
data
A numeric vector, matrix, or data frame of observations.
Categorical variables are not allowed.
If a matrix or data frame, rows correspond to observations and
columns correspond to variables.
z
A matrix whose [i,k]th entry is an initial estimate of the conditional probability of the ith observation belonging to the kth component of the mixture.
weights
A vector of positive weights, where the [i]th entry is the weight for the ith observation. If any of the weights are greater than one, they are rescaled so that the maximum weight is one (a short preparation sketch follows this argument list).
prior
Specification of a conjugate prior on the means and variances. See the help file for priorControl for further information. The default assumes no prior.
control
A list of control parameters for EM. The defaults are set by the call emControl().
Vinv
If the model is to include a noise term, Vinv is an estimate of the reciprocal hypervolume of the data region. If set to a negative value or 0, the model will include a noise term with the reciprocal hypervolume estimated by the function hypvol. The default, Vinv = NULL, does not assume a noise term in the model.
warn
A logical value indicating whether or not certain warnings (usually related to singularity) should be issued when the estimation fails. The default is given by the warn setting in mclust.options.
...
Catches unused arguments in indirect or list calls via do.call.
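As a rough sketch of how these arguments might be prepared in practice (the calls to unmap, hypvol, and emControl below are plausible uses of the helper functions referenced above, and the explicit weight rescaling simply mirrors what the weights description says is done internally; none of this is prescribed by this page):

library(mclust)
X  <- iris[, -5]                       # observations: rows = cases, columns = variables
z0 <- unmap(iris[, 5])                 # z: initial membership probabilities from known classes
w  <- runif(nrow(X), 0.5, 2)           # weights: hypothetical positive case weights
w  <- w / max(w)                       # rescale so the maximum weight is one
Vinv0 <- hypvol(X, reciprocal = TRUE)  # Vinv: reciprocal hypervolume, if a noise term is wanted
ctrl  <- emControl(tol = 1e-5, itmax = 100)  # control: EM convergence settings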
Value
A list including the following components:
References
C. Fraley and A. E. Raftery (2002).
Model-based clustering, discriminant analysis, and density estimation.
Journal of the American Statistical Association 97:611-631.

C. Fraley and A. E. Raftery (2005).
Bayesian regularization for normal mixture estimation and model-based clustering.
Technical Report, Department of Statistics, University of Washington.

C. Fraley and A. E. Raftery (2007).
Bayesian regularization for normal mixture estimation and model-based clustering.
Journal of Classification 24:155-181.

C. Fraley, A. E. Raftery, T. B. Murphy and L. Scrucca (2012).
mclust Version 4 for R: Normal Mixture Modeling for Model-Based Clustering, Classification, and Density Estimation.
Technical Report No. 597, Department of Statistics, University of Washington.

See Also
me, meE, ..., meVVV, em, mstep, estep, priorControl, mclustModelNames, mclustVariance, mclust.options
Examples
library(mclust)
w <- rep(1, 150)
w[1] <- 0
me.weighted(modelName = "VVV", data = iris[, -5], z = unmap(iris[, 5]), weights = w)
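A slightly fuller, hedged variant of the example above, showing the prior and control arguments and inspecting the fit; the components accessed on the result (loglik, z) are assumed to follow the usual mclust conventions, since this page does not list them explicitly:

fit <- me.weighted(modelName = "VVV", data = iris[, -5],
                   z = unmap(iris[, 5]), weights = w,
                   prior = priorControl(),           # add the default conjugate prior
                   control = emControl(itmax = 200)) # cap the number of EM iterations
fit$loglik                        # weighted log-likelihood at convergence (assumed component)
table(map(fit$z), iris[, 5])      # implied classification vs. species (assumed component)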