Computes the log-likelihood of observed categorical data under a Latent Class Analysis (LCA) model given class probabilities and conditional response probabilities. The calculation assumes local independence of responses conditional on latent class membership.
get.Log.Lik.LCA(response, P.Z, par)

Returns a single numeric value representing the total log-likelihood: $$\log \mathcal{L} = \sum_{n=1}^N \log \left[ \sum_{l=1}^L \pi_l \prod_{i=1}^I P(X_{ni} = x_{ni} \mid Z=l) \right]$$
where \(x_{ni}\) is the standardized (0-based) response for person \(n\) on item \(i\).
A matrix of dimension \(N \times I\) containing discrete responses. Values may use any categorical encoding (e.g., 1/2/3, A/B/C, or 0/1). The function automatically:
Converts all responses to 0-based integer encoding internally
Determines the maximum number of categories (\(K_{\max}\)) across items
A numeric vector of length \(L\) containing prior probabilities for latent classes. Must satisfy:
\(\sum_{l=1}^L \pi_l = 1\)
\(\pi_l > 0\) for all \(l = 1, \dots, L\)
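The two constraints on P.Z can be checked before calling the function. The helper below is an illustrative sketch, not part of the package; its name `check_class_probs` is an assumption.

```r
# Sketch: validate a class-probability vector P.Z.
# (Hypothetical helper -- not exported by the package.)
check_class_probs <- function(P.Z, tol = 1e-8) {
  stopifnot(is.numeric(P.Z), all(P.Z > 0))        # pi_l > 0 for all l
  stopifnot(abs(sum(P.Z) - 1) < tol)              # probabilities sum to 1
  invisible(TRUE)
}

check_class_probs(c(0.3, 0.5, 0.2))  # passes silently
```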
A 3-dimensional array of dimension \(L \times I \times K_{\max}\) containing conditional probabilities, where \(par[l, i, k]\) represents \(P(X_i = k-1 \mid Z=l)\) (after internal 0-based re-encoding). Must satisfy:
For each class \(l\) and item \(i\): \(\sum_{k=1}^{K_i} par[l,i,k] = 1\)
Probabilities for non-existent categories (where \(k > K_i\)) are ignored in the computation, but the corresponding array slots must still be present (their values are unused, so zero-padding is fine)
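A valid `par` array can be built as in the following sketch. The dimensions and the choice of a binary second item are illustrative assumptions; unused slots for items with fewer than \(K_{\max}\) categories are zero-padded.

```r
# Sketch: build a conditional-probability array of dimension L x I x K_max.
# Item 2 is assumed binary, so its third slot stays 0 (present but ignored).
set.seed(1)
L <- 2; I <- 3; K_max <- 3
par <- array(0, dim = c(L, I, K_max))
for (l in 1:L) {
  for (i in 1:I) {
    K_i <- if (i == 2) 2 else K_max      # number of real categories of item i
    p <- runif(K_i)
    par[l, i, 1:K_i] <- p / sum(p)       # normalize: sums to 1 over real categories
  }
}
```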
The log-likelihood calculation follows these steps:
Response Standardization: Original responses are converted to 0-based integers using adjust.response. For example, original values {1, 2, 5} become {0, 1, 2} (ordered and relabeled sequentially).
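This recoding can be reproduced with a standalone sketch; adjust.response is the package-internal helper, so the version below (`adjust_response`) is an assumption about its behavior based on the description above.

```r
# Sketch: recode each column of a response matrix to 0-based integers,
# relabeling sorted distinct values sequentially (mimics adjust.response).
adjust_response <- function(response) {
  apply(response, 2, function(x) match(x, sort(unique(x))) - 1L)
}

adjust_response(matrix(c(1, 2, 5, 2), ncol = 1))
# distinct values {1, 2, 5} map to {0, 1, 2}
```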
Class-Specific Likelihood: For each observation \(n\) and class \(l\), compute:
$$P(\mathbf{X}_n \mid Z_n=l) = \prod_{i=1}^I P(X_{ni} = x_{ni} \mid Z_n=l)$$
where \(x_{ni}\) is the standardized response value, and probabilities are taken from par[l, i, x_{ni}+1].
Marginal Likelihood: For each observation \(n\), combine class-specific likelihoods weighted by class probabilities: $$P(\mathbf{X}_n) = \sum_{l=1}^L \pi_l \cdot P(\mathbf{X}_n \mid Z_n=l)$$
Log Transformation: Sum log-transformed marginal likelihoods across all observations: $$\log \mathcal{L} = \sum_{n=1}^N \log P(\mathbf{X}_n)$$
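The steps above can be sketched end to end. This is an illustrative re-implementation, not the package's code; it assumes responses are already 0-based (the real function recodes them first via adjust.response), and the name `get_log_lik_lca` is hypothetical.

```r
# Sketch: total LCA log-likelihood, assuming 0-based responses.
get_log_lik_lca <- function(response, P.Z, par) {
  N <- nrow(response); I <- ncol(response); L <- length(P.Z)
  log_lik <- 0
  for (n in 1:N) {
    marg <- 0                                        # P(X_n), marginal over classes
    for (l in 1:L) {
      cond <- 1                                      # P(X_n | Z_n = l), local independence
      for (i in 1:I) {
        cond <- cond * par[l, i, response[n, i] + 1] # P(X_ni = x_ni | Z_n = l)
      }
      marg <- marg + P.Z[l] * cond                   # weight by class probability pi_l
    }
    log_lik <- log_lik + log(marg)                   # sum of log marginal likelihoods
  }
  log_lik
}

# Tiny check: one class, one binary item, responses 0 and 1.
par1 <- array(c(0.4, 0.6), dim = c(1, 1, 2))
get_log_lik_lca(matrix(c(0, 1), ncol = 1), P.Z = 1, par = par1)
# equals log(0.4) + log(0.6)
```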