Fits a deep Gaussian mixture model to multivariate data.
deepgmm(y, layers, k, r,
it = 250, eps = 0.001, init = "kmeans", init_est = "factanal",
seed = NULL, scale = TRUE)
An object of class "dgmm"
containing the fitted values. It contains the following components:
A list in which the ith element is the loading matrix for the ith layer.
A list of the mixing proportions for each layer (the element w[[i]][j] contains the mixing proportion of the jth component in the ith layer).
A list of matrices holding the component means in their columns (the element mu[[i]][, j] contains the mean of the jth component in the ith layer).
A list of arrays holding the covariance matrices of the random error components (the element psi[[i]][j, , ] contains the error covariance matrix of the jth component in the ith layer).
The log-likelihood after each EM iteration.
The Bayesian information criterion for the model fit.
The Akaike information criterion for the model fit.
The classification likelihood information criterion for the model fit.
The integrated classification likelihood criterion for the model fit.
The clustering of the observations.
The value of the seed used.
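For example, assuming a model has been fitted as in the Examples below, the components named above (w, mu and psi, as given in their descriptions) can be inspected as follows; str lists the remaining component names, which are not repeated here. This is a minimal sketch, not part of the package documentation:

# Minimal sketch: inspecting a fitted "dgmm" object.
set.seed(1)
fit <- deepgmm(y = scale(mtcars), layers = 2, k = c(3, 2), r = c(3, 2))
str(fit, max.level = 1)   # names and shapes of all returned components
fit$w[[1]]                # mixing proportions in the first layer
fit$mu[[1]][, 2]          # mean of the 2nd component in the first layer
fit$psi[[1]][2, , ]       # error covariance matrix of that component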
y: A matrix or a data frame in which the rows correspond to observations and the columns to variables.
layers: The number of layers in the deep Gaussian mixture model, limited to 1, 2 or 3.
k: A vector of integers of length layers containing the number of groups in the different layers.
r: A vector of integers of length layers containing the dimensions at the different layers. The dimensions of the layers must be decreasing in size. See Details.
it: Maximum number of EM iterations.
eps: The EM algorithm terminates if the relative increment of the log-likelihood falls below this value.
init: Procedure to obtain an initial partition of the observations. See Details.
init_est: Procedure for computing the initial parameter values for the given initial partition of the data. See Details.
seed: Integer value passed to the set.seed function at the beginning of the deepgmm function.
scale: If scale = TRUE, the columns of the data, y, will be scaled to zero mean and unit variance.
Cinzia Viroli, Geoffrey J. McLachlan
The deep Gaussian mixture model is a hierarchical model organized in a multilayered architecture where, at each layer, the variables follow a mixture of Gaussian distributions. This set of nested mixtures of linear models provides a globally nonlinear model that can fit the data in a very flexible way. In order to avoid overparameterized solutions, dimension reduction by factor models can be applied at each layer of the architecture, thus resulting in deep mixtures of factor analyzers.
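For instance, in a two-layer model the generative mechanism can be sketched as follows (notation as in Viroli and McLachlan, 2019): with probability \(\pi^{(1)}_j\), the observed vector satisfies \(y = \eta^{(1)}_j + \Lambda^{(1)}_j z^{(1)} + u^{(1)}_j\), and, with probability \(\pi^{(2)}_l\), the first-layer latent vector satisfies \(z^{(1)} = \eta^{(2)}_l + \Lambda^{(2)}_l z^{(2)} + u^{(2)}_l\), where \(z^{(2)} \sim N(0, I)\), the \(u\) terms are Gaussian errors, and the loading matrices \(\Lambda\) perform the dimension reduction.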
The data y
must be a matrix or a data frame containing
numerical values, with no missing values. The rows must correspond to
observations and the columns to variables.
Presently, the maximum value of layers
implemented is 3.
The ith element of k
contains the number of groups in the ith layer. Thus
the length of k
must equal layers
.
The parameter vector r
contains the latent variable dimension of
each layer.
Variables at different layers have progressively decreasing dimension,
\(r_1, r_2, \dots, r_h\), where \(p > r_1 > r_2 > \dots > r_h \geq 1\).
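As a quick illustration of this constraint, a hypothetical helper such as the following could check a candidate r before calling deepgmm (this helper is not part of the package):

# Hypothetical helper (not part of deepgmm): check that the latent
# dimensions are strictly decreasing and smaller than ncol(y).
check_r <- function(y, r) {
  dims <- c(ncol(y), r)
  all(diff(dims) < 0) && r[length(r)] >= 1
}
check_r(scale(mtcars), c(3, 2))   # TRUE:  11 > 3 > 2 >= 1
check_r(scale(mtcars), c(2, 3))   # FALSE: dimensions must decrease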
The EM algorithm used by deepgmm
requires initialization.
The initialization is done by first partitioning the dataset,
and then estimating the initial values for model parameters
based on the partition.
There are four options available in deepgmm
for the
initial partitioning of the data:
random partitioning (init = "random"
),
clustering using the k-means algorithm of Hartigan and Wong
(init = "kmeans"
),
agglomerative hierarchical clustering (init = "hclass"
),
and Gaussian mixture model based clustering
(init = "mclust"
).
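For example, the same model can be refitted under different initial partitions; this is a sketch using only the documented options, and the resulting clusterings will generally differ across initializations:

# Sketch: refitting the same model under two of the initialization options.
y <- scale(mtcars)
set.seed(1)
fit_kmeans <- deepgmm(y, layers = 2, k = c(3, 2), r = c(3, 2), init = "kmeans")
set.seed(1)
fit_random <- deepgmm(y, layers = 2, k = c(3, 2), r = c(3, 2), init = "random")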
After the initial partitioning has been chosen, initial values of
the parameters in the component analyzers need to be
calculated. Currently only one option is available.
This default option, init_est = "factanal"
, provides initial
estimates of the parameters based on factor analysis.
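The kind of factor model involved can be illustrated with stats::factanal; this is an illustration of the idea only and does not reproduce the package's internal initialization:

# Illustration only: a maximum-likelihood factor model of the kind
# underlying init_est = "factanal" (not the package's internal code).
fa <- factanal(scale(mtcars), factors = 2)
fa$loadings       # estimated factor loadings
fa$uniquenesses   # specific variances (diagonal error covariance)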
Viroli, C. and McLachlan, G.J. (2019). Deep Gaussian mixture models. Statistics and Computing 29, 43-51.
# \donttest{
layers <- 2
k <- c(3, 4)
r <- c(3, 2)
it <- 50
eps <- 0.001
y <- scale(mtcars)
set.seed(1)
fit <- deepgmm(y = y, layers = layers, k = k, r = r,
               it = it, eps = eps)
fit
summary(fit)
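# A one-layer fit for comparison; with layers = 1, k and r must
# each have length one (assumption: a one-layer deep GMM with
# dimension reduction amounts to a mixture of factor analyzers).
set.seed(1)
fit1 <- deepgmm(y = y, layers = 1, k = 3, r = 2, it = it, eps = eps)
summary(fit1)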
# }