NetworkToolbox (version 1.2.2)

kld: Kullback-Leibler Divergence

Description

Estimates the Kullback-Leibler divergence, which measures how one probability distribution diverges from a reference distribution (equivalent means are assumed). Matrices must be positive definite inverse covariance matrices for an accurate measurement. Note that the Kullback-Leibler divergence is not a true distance metric: it is not symmetric.
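
For intuition, the divergence between two multivariate normal distributions with equal means has a closed form in their precision (inverse covariance) matrices. The sketch below illustrates that formula under the equal-means assumption; it is not necessarily the package's exact internal computation, and kld_gaussian is a hypothetical helper name.

# A sketch of the closed-form KL divergence between two equal-mean
# Gaussians, given precision (inverse covariance) matrices 'base'
# (reference) and 'test' (approximation); the package's internals
# may differ
kld_gaussian <- function(base, test) {
  k <- nrow(base)
  0.5 * (sum(diag(test %*% solve(base))) - k +
           log(det(base) / det(test)))
}

# Sanity check: identical distributions diverge by zero
kld_gaussian(diag(3), diag(3))  # 0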

Usage

kld(base, test)

Arguments

base

Full or base model

test

Reduced or testing model

Value

A value greater than or equal to 0. Smaller values suggest that the probability distribution of the reduced model is close to that of the full model.

References

Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22, 79-86. doi: 10.1214/aoms/1177729694

Examples

library(NetworkToolbox)

# Full model: inverse of the sample covariance matrix of the neoOpen data
A1 <- solve(cov(neoOpen))

# Reduced model: sparse inverse covariance matrix estimated by LoGo
A2 <- LoGo(neoOpen)

# Divergence of the reduced model from the full model
kld_value <- kld(A1, A2)
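
Because smaller values indicate a closer fit, a quick sanity check is that comparing a model with itself should return a divergence of (approximately) zero:

# A model compared with itself diverges by 0
kld(A1, A1)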
