
MEET (version 5.1.1)

divergence.Shannon: Mutual Information

Description

This function calculates mutual information (Rényi order equal to 1) by means of the Kullback-Leibler divergence.
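
For order 1, this mutual information equals the Kullback-Leibler divergence between the joint distribution and the product of its marginals, which in turn equals H(X) + H(Y) - H(X,Y). A minimal sketch of that identity in base R, independent of MEET (the matrix pxy and all names below are made up for illustration):

# Made-up joint distribution of two binary variables; not MEET output
pxy <- matrix(c(0.20, 0.05,
                0.10, 0.65), nrow = 2, byrow = TRUE)
px <- rowSums(pxy)   # marginal distribution of X
py <- colSums(pxy)   # marginal distribution of Y
# Mutual information as D_KL(pxy || px * py), skipping zero cells
nz <- pxy > 0
MI <- sum(pxy[nz] * log2(pxy[nz] / (px %o% py)[nz]))
# The same value via the entropy identity H(X) + H(Y) - H(X,Y)
entr <- function(p) -sum(p[p > 0] * log2(p[p > 0]))
all.equal(MI, entr(px) + entr(py) - entr(pxy))   # TRUE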

Usage

divergence.Shannon(training.set, H, HXY, correction)

Arguments

training.set
A set of aligned nucleotide sequences
H
Entropy
HXY
Joint Entropy
correction
Correction for the finite sample size effect

Details

The Rényi order has to be equal to 1: Shannon entropy is the q -> 1 limit of Rényi entropy, as the sketch below illustrates.
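
A small numeric check of that limit in base R (the vector p and the helper renyi are made up for illustration; they are not MEET functions):

# Rényi entropy approaches Shannon entropy as q -> 1
p <- c(0.5, 0.3, 0.2)                            # made-up probability vector
renyi <- function(p, q) log2(sum(p^q)) / (1 - q)
shannon <- -sum(p * log2(p))
renyi(p, 0.999)   # close to shannon
renyi(p, 1.001)   # close to shannon
shannon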

See Also

divergence.Renyi, PredictDivergence, kfold.divergence

Examples

require("MEET")
data(TranscriptionFactor)
data(BackgroundOrganism)
data(iicc)
q<-1
training.set<-TranscriptionFactor
correction<-correction.entropy(q,p=nrow(training.set),long=1,iicc)
HXmax<-entropy.Shannon(as.matrix(Prob))
pmX<-probability(training.set,Prob)
Probtrans<-probability.couple(Prob)
H<-entropy.Shannon(pmX)
pmXY<-joint.probability(training.set, Prob, Probtrans)
HXY<-entropy.joint(pmXY,q,iicc)
divergence.Shannon(training.set,H,HXY,correction)
