Estimate mixture latent variable model
mixture(x, data, k = length(x), control = list(), vcov = "observed", ...)
x: List of lvm objects. If only a single lvm object is given, a k-mixture of this model is fitted (free parameters varying between mixture components).
data: data.frame
k: Number of mixture components
control: Optimization parameters (see details)
vcov: Type of asymptotic covariance matrix (NULL to omit)
...: Additional arguments passed to lower-level functions
Estimate parameters in a mixture of latent variable models via the EM algorithm.
The performance of the EM algorithm can be tuned via the control argument, a list in which a subset of the following members can be altered:
start: Optional starting values
nstart: Evaluate nstart different starting values and run the EM algorithm on the parameters with the largest likelihood
tol: Convergence tolerance of the EM algorithm. The algorithm is stopped when the absolute change in likelihood and parameters (2-norm) between successive iterations is less than tol
iter.max: Maximum number of iterations of the EM algorithm
gamma: Scale-down (i.e. a number between 0 and 1) of the step size of the Newton-Raphson algorithm in the M-step
trace: Trace information on the EM algorithm is printed on every trace'th iteration
Note that the algorithm can be aborted at any time (Ctrl-C) and the estimates obtained so far will still be returned (via an on.exit call).
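As an illustration of the control list, the following sketch restarts the EM algorithm from several random starting values and tightens the convergence criterion. It assumes the lava package is loaded and that a model object m and data.frame d are available as in the examples below; the specific control values are arbitrary choices, not recommendations.

```r
## Sketch: tuning the EM algorithm via the control argument
## (assumes 'm' and 'd' are defined as in the examples below)
M <- mixture(m, k = 2, data = d,
             control = list(nstart = 5,      ## try 5 random starting values
                            tol = 1e-9,      ## stricter convergence tolerance
                            iter.max = 500,  ## cap on EM iterations
                            trace = 0))      ## suppress progress output
summary(M)
```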
mvnmix
m0 <- lvm(list(y ~ x + z, x ~ z))
distribution(m0, ~z) <- binomial.lvm()
d <- sim(m0, 2000, p = c("y<-z" = 2, "y<-x" = 1), seed = 1)

## unmeasured confounder example
m <- baptize(lvm(y ~ x))
covariance(m, ~x) <- "v"
intercept(m, ~x + y) <- NA
set.seed(42)
M <- mixture(m, k = 2, data = d, control = list(trace = 1, tol = 1e-6))
summary(M)
lm(y ~ x, d)
estimate(M, "y~x")
## True slope := 1
# }
# NOT RUN {
# }
Run the code above in your browser using DataLab