Expectation-Maximization (EM) algorithm to calculate the optimal Gaussian Mixture Model (GMM) for given data in one dimension.
EMGauss(Data, K, Means, SDs, Weights, MaxNumberofIterations, fast)
List with
means of the GMM generated by the EM algorithm
standard deviations of the GMM generated by the EM algorithm
prior probabilities of the Gaussians
vector of data points
estimated number of Gaussian kernels
vector(1:L), means of the Gaussians, L == number of Gaussians
vector(1:L), standard deviations of the Gaussians
optional, relative number of points in each Gaussian (prior probabilities): sum(Weights) == 1, default weight is 1/L
optional, number of iterations; default = 10
optional, default FALSE: uses mclust's EM, see function densityMclust of that package; TRUE: naive but faster EM implementation, which may be numerically unstable because log(gauss) is not used (a one-iteration log-space sketch is given after the reference below)
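A minimal usage sketch, assuming the package loads as AdaptGauss and that the returned list exposes components named Means, SDs and Weights (the component names are not stated explicitly above):

library(AdaptGauss)

# synthetic 1D data drawn from two Gaussians
set.seed(42)
Data <- c(rnorm(600, mean = 0, sd = 1), rnorm(400, mean = 5, sd = 0.5))

# initial guesses for two kernels; Weights must sum to 1
gmm <- EMGauss(Data, K = 2,
               Means = c(-1, 4), SDs = c(2, 2), Weights = c(0.5, 0.5),
               MaxNumberofIterations = 20, fast = FALSE)

gmm$Means    # fitted means (assumed component name)
gmm$SDs      # fitted standard deviations (assumed component name)
gmm$Weights  # fitted prior probabilities (assumed component name)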
Onno Hansen-Goos, Michael Thrun, Florian Lerch
No adding or removing of Gaussian kernels: the number of Gaussians has to be set by the length of the vectors Means, SDs and Weights.
This EM is only for univariate data. For multivariate data see the package mclust.
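For the multivariate case, a brief sketch using mclust's densityMclust (the function named above); the exact call shown is an assumption about a typical invocation, so check the mclust documentation for version-specific arguments:

library(mclust)

# bivariate example data from two clusters
set.seed(1)
X <- rbind(matrix(rnorm(200, mean = 0), ncol = 2),
           matrix(rnorm(200, mean = 3), ncol = 2))

fit <- densityMclust(X, G = 2)      # EM fit with 2 mixture components
summary(fit, parameters = TRUE)     # means, covariances, mixing proportions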
Bishop, Christopher M.: Pattern Recognition and Machine Learning. Springer, 2006, p. 435 ff.
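To make the note on the fast option concrete, the following is an illustrative one-iteration EM step for the univariate case, written in log space (log-sum-exp) as in the stable variant; em_step is a hypothetical helper for illustration only, not a function of the package, with updates following Bishop, ch. 9:

# one EM iteration for a univariate GMM (assumes K >= 2 components)
em_step <- function(x, means, sds, weights) {
  K <- length(means); n <- length(x)
  # E-step: responsibilities via log densities, stabilised by subtracting the row maximum
  logdens <- sapply(1:K, function(k)
    log(weights[k]) + dnorm(x, mean = means[k], sd = sds[k], log = TRUE))
  resp <- exp(logdens - apply(logdens, 1, max))
  resp <- resp / rowSums(resp)
  # M-step: re-estimate weights, means and standard deviations
  Nk <- colSums(resp)
  newMeans <- colSums(resp * x) / Nk
  newSDs   <- sqrt(sapply(1:K, function(k) sum(resp[, k] * (x - newMeans[k])^2) / Nk[k]))
  list(Means = newMeans, SDs = newSDs, Weights = Nk / n)
}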
AdaptGauss