BayesNetBP (version 1.3.0)

ComputeKLDs: Compute signed and symmetric Kullback-Leibler divergence

Description

Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence

Usage

ComputeKLDs(tree, var0, vars, seq, pbar = TRUE, method = "gaussian")

Arguments

tree

a ClusterTree object

var0

the variable to have evidence absorbed

vars

the variables to have divergence computed

seq

a numeric vector of evidence values to absorb

pbar

logical(1), whether to show a progress bar

method

method for divergence computation: "gaussian" uses a Gaussian approximation; a second method uses Monte Carlo integration

Value

a data.frame of the computed divergences

Details

Compute the signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence. The signed and symmetric Kullback-Leibler divergence is also known as Jeffrey's signed information (JSI) for continuous variables.
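The symmetric KL divergence between two univariate Gaussians has a closed form, which is the quantity the "gaussian" method approximates. A minimal R sketch follows; the function names `sym_kl_gauss` and `signed_kl_gauss`, and the sign convention based on the direction of the mean shift, are illustrative assumptions, not the package's internals:

```r
# Symmetric KL divergence between N(m1, s1^2) and N(m2, s2^2),
# i.e. KL(p || q) + KL(q || p) in closed form (log-variance terms cancel).
sym_kl_gauss <- function(m1, s1, m2, s2) {
  d2 <- (m1 - m2)^2
  (s1^2 + d2) / (2 * s2^2) + (s2^2 + d2) / (2 * s1^2) - 1
}

# A signed variant: magnitude from the symmetric KL, sign from the
# direction of the mean shift (an illustrative convention only).
signed_kl_gauss <- function(m1, s1, m2, s2) {
  sign(m2 - m1) * sym_kl_gauss(m1, s1, m2, s2)
}

sym_kl_gauss(0, 1, 0, 1)  # identical distributions: 0
sym_kl_gauss(0, 1, 2, 1)  # mean shift of 2: (1+4)/2 + (1+4)/2 - 1 = 4
```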

Examples

data(liver)                       # example dataset shipped with BayesNetBP
# Build the cluster tree for the liver network and propagate it
tree.init.p <- Initializer(dag=liver$dag, data=liver$data, 
                           node.class=liver$node.class, 
                           propagate = TRUE)
# Absorb evidence on Nr1i3 over a grid of values and compute
# divergences for all other nodes
klds <- ComputeKLDs(tree=tree.init.p, var0="Nr1i3", 
                    vars=setdiff(tree.init.p@node, "Nr1i3"),
                    seq=seq(-3,3,0.5))
head(klds)