emMixHMMR implements maximum-likelihood parameter estimation of a mixture of HMMR (Hidden Markov Model Regression) models by the Expectation-Maximization (EM) algorithm.
emMixHMMR(X, Y, K, R, p = 3,
  variance_type = c("heteroskedastic", "homoskedastic"),
  order_constraint = TRUE, init_kmeans = TRUE, n_tries = 1,
  max_iter = 1000, threshold = 1e-06, verbose = FALSE)
X: Numeric vector of length m representing the covariates/inputs \(x_{1},\dots,x_{m}\).

Y: Matrix of size \((n, m)\) representing the observed responses/outputs. Y consists of n functions of X observed at points \(1,\dots,m\).

K: The number of clusters (number of HMMR models).

R: The number of regimes (HMMR components) for each cluster.

p: Optional. The order of the polynomial regression. By default, p is set to 3.

variance_type: Optional. A character string indicating whether the model is "heteroskedastic" or "homoskedastic". By default, the model is "heteroskedastic".

order_constraint: Optional. A logical indicating whether or not a mask of order one should be applied to the transition matrix of the Markov chain to provide ordered states. For the purpose of segmentation, it must be set to TRUE (the default).

init_kmeans: Optional. A logical indicating whether or not the curve partition should be initialized by the K-means algorithm. Otherwise the curve partition is initialized randomly.

n_tries: Optional. Number of runs of the EM algorithm. The solution providing the highest log-likelihood is returned. If n_tries > 1, then for the first run, parameters are initialized by uniformly segmenting the data into K segments, and for the next runs, parameters are initialized by randomly segmenting the data into K contiguous segments (an illustrative call with n_tries > 1 is sketched after the examples below).

max_iter: Optional. The maximum number of iterations for the EM algorithm.

threshold: Optional. A numeric value specifying the threshold for the relative difference of log-likelihood between two steps of the EM algorithm, used as the stopping criterion.

verbose: Optional. A logical indicating whether or not values of the log-likelihood should be printed during EM iterations.
emMixHMMR returns an object of class ModelMixHMMR.
The emMixHMMR function implements the EM algorithm. It starts with an initialization of the parameters performed by the method initParam of the class ParamMixHMMR, then alternates between the E-Step (a method of the class StatMixHMMR) and the M-Step (a method of the class ParamMixHMMR) until convergence, that is, until the relative variation of the log-likelihood between two EM iterations falls below the threshold parameter.
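In symbols, writing \(\mathcal{L}^{(t)}\) for the log-likelihood at EM iteration \(t\), this stopping rule (restated here for clarity; the exact form evaluated internally may differ slightly) reads \(\left|(\mathcal{L}^{(t)} - \mathcal{L}^{(t-1)})/\mathcal{L}^{(t-1)}\right| <\) threshold, with the algorithm also halting once max_iter iterations have been performed.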
data(toydataset)
# Covariates/inputs: the sampling points x_1, ..., x_m
x <- toydataset$x
# Responses/outputs: one curve per row (n curves observed at m points)
Y <- t(toydataset[, 2:ncol(toydataset)])
# Fit a mixture of K = 3 HMMR models, each with R = 3 regimes and linear (p = 1) regressions
mixhmmr <- emMixHMMR(X = x, Y = Y, K = 3, R = 3, p = 1, verbose = TRUE)
mixhmmr$summary()
mixhmmr$plot()
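The following additional sketch is not part of the package's shipped example; it merely illustrates the non-default arguments described above (homoskedastic variances and several EM restarts) on the same toydataset, with argument values chosen purely for illustration:

data(toydataset)
x <- toydataset$x
Y <- t(toydataset[, 2:ncol(toydataset)])
# Homoskedastic variances, 5 EM restarts, tighter stopping threshold;
# these values are illustrative, not recommendations
mixhmmr_homo <- emMixHMMR(X = x, Y = Y, K = 3, R = 3, p = 2,
                          variance_type = "homoskedastic",
                          n_tries = 5, max_iter = 500, threshold = 1e-5,
                          verbose = FALSE)
mixhmmr_homo$summary()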