
dad (version 4.1.6)

jeffreyspar: Jeffreys measure between Gaussian densities given their parameters

Description

Jeffreys measure (or symmetrised Kullback-Leibler divergence) between two multivariate (\(p > 1\)) or univariate (\(p = 1\)) Gaussian densities, given their parameters (mean vectors and covariance matrices if they are multivariate, means and variances if univariate) (see Details).

Usage

jeffreyspar(mean1, var1, mean2, var2, check = FALSE)

Value

Jeffreys measure between two Gaussian densities.

Be careful! If check = FALSE and one covariance matrix is degenerate (multivariate case) or one variance is zero (univariate case), the returned value is meaningless and should not be used.
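For instance (a minimal, hedged sketch: the values below are illustrative, and the exact way check = TRUE reports a degenerate matrix, error or warning, is not stated here, hence the tryCatch):

library(dad)
m1 <- c(0, 0)
v1 <- matrix(c(1, 1, 1, 1), ncol = 2)   # singular (degenerate) covariance matrix
m2 <- c(1, 0)
v2 <- diag(2)
# With the default check = FALSE the returned number is meaningless here.
# With check = TRUE the covariance matrices are verified first.
tryCatch(jeffreyspar(m1, v1, m2, v2, check = TRUE),
         error   = function(e) conditionMessage(e),
         warning = function(w) conditionMessage(w))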

Arguments

mean1

\(p\)-length numeric vector: the mean of the first Gaussian density.

var1

\(p\) x \(p\) symmetric numeric matrix (\(p\) > 1) or numeric (\(p\) = 1): the covariance matrix (\(p\) > 1) or the variance (\(p\) = 1) of the first Gaussian density.

mean2

\(p\)-length numeric vector: the mean of the second Gaussian density.

var2

\(p\) x \(p\) symmetric numeric matrix (\(p\) > 1) or numeric (\(p\) = 1): the covariance matrix (\(p\) > 1) or the variance (\(p\) = 1) of the second Gaussian density.

check

logical. If TRUE, the function checks that the covariance matrices are not degenerate (multivariate case) or that the variances are not zero (univariate case). The default is FALSE.

Author

Rachid Boumaza, Pierre Santagostini, Smail Yousfi, Gilles Hunault, Sabine Demotes-Mainard

Details

Let \(m1\) and \(m2\) be the mean vectors and \(v1\) and \(v2\) the covariance matrices. The Jeffreys measure of the two Gaussian densities is equal to:

$$\frac{1}{2} \, t(m1 - m2) \, (v1^{-1} + v2^{-1}) \, (m1 - m2) \; - \; \frac{1}{2} \, tr\big( (v1 - v2) (v1^{-1} - v2^{-1}) \big)$$

If \(p = 1\), the means and variances are numbers and the same formula applies, ignoring the operators t (transpose of a matrix or vector) and tr (trace of a square matrix).
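As a cross-check (a sketch using only base R; the object names are illustrative), the formula can be evaluated directly and compared with the value returned by jeffreyspar:

library(dad)
m1 <- c(1, 1); v1 <- matrix(c(4, 1, 1, 9), ncol = 2)
m2 <- c(0, 1); v2 <- diag(2)
d  <- m1 - m2
i1 <- solve(v1)   # v1^{-1}
i2 <- solve(v2)   # v2^{-1}
# Jeffreys measure computed from the formula above
jman <- 0.5 * drop(t(d) %*% (i1 + i2) %*% d) -
  0.5 * sum(diag((v1 - v2) %*% (i1 - i2)))
# Should agree with the package function
all.equal(jman, as.numeric(jeffreyspar(m1, v1, m2, v2)))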

References

McLachlan, G.J. (1992). Discriminant analysis and statistical pattern recognition. John Wiley & Sons, New York.

Thabane, L., Safiul Haq, M. (1999). On Bayesian selection of the best population using the Kullback-Leibler divergence measure. Statistica Neerlandica, 53(3): 342-360.

See Also

jeffreys: Jeffreys measure of two parametrically estimated Gaussian densities, given samples.

Examples

m1 <- c(1, 1)                           # mean vector of the first Gaussian density
v1 <- matrix(c(4, 1, 1, 9), ncol = 2)   # covariance matrix of the first Gaussian density
m2 <- c(0, 1)                           # mean vector of the second Gaussian density
v2 <- matrix(c(1, 0, 0, 1), ncol = 2)   # covariance matrix of the second Gaussian density
jeffreyspar(m1, v1, m2, v2)
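
A univariate (\(p = 1\)) call works the same way, with the means and variances given as plain numbers (the values below are illustrative):

# Univariate case: means and variances are numbers
jeffreyspar(0, 1, 1, 4)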
