bnlearn (version 2.6)

bn.cv: Cross-validation for Bayesian networks

Description

Perform a k-fold cross-validation for a learning algorithm or a fixed network structure.

Usage

bn.cv(data, bn, loss = NULL, k = 10, algorithm.args = list(),
  loss.args = list(), fit = "mle", fit.args = list(),
  cluster = NULL, debug = FALSE)
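As a sketch of the two calling conventions, the snippet below first cross-validates a structure learning algorithm (named by its label) and then a fixed network structure built with model2network(); the model string is an assumed structure over the variables of the learning.test dataset bundled with bnlearn.

```r
library(bnlearn)

# cross-validate a structure learning algorithm, named by its label
bn.cv(learning.test, "hc", k = 10)

# cross-validate a fixed network structure instead; the model string
# below is an assumed DAG over the learning.test variables
dag <- model2network("[A][C][F][B|A][D|A:C][E|B:F]")
bn.cv(learning.test, dag, k = 10)
```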

Arguments

  • data: a data frame containing the variables in the model.
  • bn: either a character string (the label of the structure learning algorithm to be cross-validated) or an object of class bn (a fixed network structure).
  • loss: a character string, the label of a loss function; if NULL, the default loss for the type of network is used.
  • k: a positive integer, the number of groups into which the data are split.
  • algorithm.args: a list of extra arguments to be passed to the learning algorithm.
  • loss.args: a list of extra arguments to be passed to the loss function.
  • fit: a character string, the label of the method used to fit the parameters of the network.
  • fit.args: a list of extra arguments to be passed to the fitting method.
  • cluster: an optional cluster object from the parallel package.
  • debug: a boolean value; if TRUE, detailed debugging output is printed.

Value

  • An object of class bn.kcv.
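A minimal sketch of inspecting the return value; printing the bn.kcv object produces the cross-validation summary shown under Examples below.

```r
library(bnlearn)

# run a 10-fold cross-validation and inspect the returned object
cv <- bn.cv(learning.test, "hc")
print(cv)    # prints the cross-validation summary
class(cv)    # "bn.kcv"
```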

Details

The following loss functions are implemented:

  • Log-Likelihood Loss (logl): also known as negative entropy or negentropy, it is the negated expected log-likelihood of the test set for the Bayesian network fitted from the training set.
  • Gaussian Log-Likelihood Loss (logl-g): the negated expected log-likelihood for Gaussian Bayesian networks.
  • Classification Error (pred): the prediction error for a single node (specified by the target parameter in loss.args) in a discrete network.
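The three loss functions above can be exercised as follows; this sketch assumes the learning.test (discrete) and gaussian.test (continuous) datasets bundled with bnlearn.

```r
library(bnlearn)

# log-likelihood loss on a discrete network
bn.cv(learning.test, "hc", loss = "logl")

# Gaussian log-likelihood loss on a continuous network
bn.cv(gaussian.test, "hc", loss = "logl-g")

# classification error for node F, selected through loss.args
bn.cv(learning.test, "hc", loss = "pred",
      loss.args = list(target = "F"))
```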

References

Koller D, Friedman N (2009). Probabilistic Graphical Models: Principles and Techniques. MIT Press.

See Also

bn.boot, rbn, bn.kcv-class.

Examples

library(bnlearn)
bn.cv(learning.test, 'hc', loss = "pred", loss.args = list(target = "F"))
#
#  k-fold cross-validation for Bayesian networks
#
#  target learning algorithm:             Hill-Climbing
#  number of subsets:                     10
#  loss function:                         Classification Error
#  expected loss:                         0.509
#
bn.cv(gaussian.test, 'mmhc')
#
#  k-fold cross-validation for Bayesian networks
#
#  target learning algorithm:             Max-Min Hill Climbing
#  number of subsets:                     10
#  loss function:                         Log-Likelihood Loss (Gaussian)
#  expected loss:                         10.63062
#
