
Usage

bn.cv(data, bn, loss = NULL, k = 10, algorithm.args = list(),
      loss.args = list(), fit = "mle", fit.args = list(), debug = FALSE)
Arguments

bn: either a character string (the label of the learning algorithm to be
  cross-validated) or an object of class bn (a fixed network structure).
loss: a character string, the label of a loss function; see below for
  details.
fit: a character string, the label of the method used to fit the
  parameters of the network; see bn.fit for details.
fit.args: additional arguments for the parameter estimation procedure;
  see bn.fit for details.
debug: a boolean value. If TRUE a lot of debugging output is printed;
  otherwise the function is completely silent.

Value

An object of class bn.kcv.
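Since bn accepts either an algorithm label or a fixed structure, both call styles can be sketched as follows (a minimal sketch, assuming the bnlearn package and its bundled learning.test data set; the whitelist passed via algorithm.args is purely illustrative):

```r
library(bnlearn)

# cross-validate a fixed structure: the structure is held constant and
# only the parameters are re-fitted on each training subset
dag <- model2network("[A][C][F][B|A][D|A:C][E|B:F]")
bn.cv(learning.test, dag, loss = "logl")

# cross-validate a learning algorithm instead, forwarding extra
# arguments to it through algorithm.args (here: a whitelist for hc)
bn.cv(learning.test, "hc",
      algorithm.args = list(whitelist = data.frame(from = "A", to = "B")))
```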
Loss Functions

The following loss functions are implemented:

- Log-Likelihood Loss (logl): also known as negative entropy or
  negentropy, it is the negated expected log-likelihood of the test set
  for the Bayesian network fitted from the training set.
- Gaussian Log-Likelihood Loss (logl-g): the negated expected
  log-likelihood for Gaussian Bayesian networks.
- Classification Error (pred): the prediction error for a single node
  (specified by the target parameter in loss.args) in a discrete
  network.

See Also

bn.boot, rbn, bn.kcv-class.
Examples

bn.cv(learning.test, 'hc', loss = "pred", loss.args = list(target = "F"))
#
# k-fold cross-validation for Bayesian networks
#
# target learning algorithm: Hill-Climbing
# number of subsets: 10
# loss function: Classification Error
# expected loss: 0.509
#
bn.cv(gaussian.test, 'mmhc')
#
# k-fold cross-validation for Bayesian networks
#
# target learning algorithm: Max-Min Hill Climbing
# number of subsets: 10
# loss function: Log-Likelihood Loss (Gaussian)
# expected loss: 10.63062
#
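The bn.kcv object returned by the calls above can be inspected beyond the printed summary; a hedged sketch (assuming bnlearn's loss() extractor for bn.kcv objects, and that the object behaves as a list with one element per fold):

```r
library(bnlearn)

cv <- bn.cv(learning.test, "hc", loss = "pred",
            loss.args = list(target = "F"))

loss(cv)   # the expected loss, averaged over the k folds
cv[[1]]    # per-fold detail: test indices, fitted network, observed loss
```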