
Learn the parameters of a Bayesian network classifier with maximum likelihood or Bayesian estimation, the weighting attributes to alleviate naive Bayes' independence assumption (WANBIA), attribute-weighted naive Bayes (AWNB), or model-averaged naive Bayes (MANB) methods. Returns a bnc_bn object.
lp(
x,
dataset,
smooth,
awnb_trees = NULL,
awnb_bootstrap = NULL,
manb_prior = NULL,
wanbia = NULL
)
A bnc_bn object.

x: The bnc_dag object. The Bayesian network classifier structure.
dataset: The data frame from which to learn network parameters.
smooth: A numeric. The smoothing value (α) for Bayesian parameter estimation. Nonnegative.
awnb_trees: An integer. The number (M) of bootstrap samples for AWNB.
awnb_bootstrap: A numeric. The size of the bootstrap subsample, relative to the size of dataset (given in [0,1]).
manb_prior: A numeric. The prior probability for an arc between the class and any feature.
wanbia: A logical. If TRUE, WANBIA feature weighting is performed.
lp learns the parameters of each local distribution as θ_ijk = (N_ijk + α) / (N_ij + r_i α), where N_ijk is the number of instances in dataset in which X_i takes its k-th value while its parents take their j-th configuration, N_ij = Σ_k N_ijk, r_i is the cardinality of X_i, and α is the value of the smooth argument. With α = 0 this reduces to maximum likelihood estimation.

WANBIA learns a unique exponent 'weight' per feature. The weights are computed by optimizing conditional log-likelihood and are bounded within [0, 1]. To obtain WANBIA parameter estimates, set wanbia to TRUE.
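The smoothed estimate above can be sketched outside of R as follows (a minimal Python illustration of Dirichlet-smoothed counts for a single distribution; smoothed_cpt is a hypothetical helper for exposition, not part of bnclassify):

```python
# Illustrative sketch of the smoothed estimate theta_k = (N_k + a) / (N + r * a)
# for one local distribution with r possible values and smoothing value a.
from collections import Counter

def smoothed_cpt(values, cardinality, smooth):
    """Return {value: probability} using Dirichlet smoothing.

    With smooth == 0 this is the maximum likelihood estimate; with
    smooth > 0 it is the Bayesian (posterior mean) estimate.
    """
    counts = Counter(values)
    n = len(values)
    denom = n + cardinality * smooth
    return {k: (counts.get(k, 0) + smooth) / denom
            for k in range(cardinality)}

# Maximum likelihood vs. Bayesian estimation on the same counts:
mle = smoothed_cpt([0, 0, 1], cardinality=3, smooth=0)    # unseen value 2 gets 0
bayes = smoothed_cpt([0, 0, 1], cardinality=3, smooth=0.5)  # all values positive
```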
In order to get the AWNB parameter estimate, provide either the awnb_bootstrap and/or the awnb_trees argument. The estimate is w_i = (1/M) Σ_{t=1}^{M} 1/√(d_ti), where M is the number of bootstrap subsamples drawn from dataset and d_ti is the minimum depth at which feature X_i is tested in an unpruned classification tree learned from the t-th subsample (the term is 0 if X_i is not tested in that tree).
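The AWNB weight for a single feature can be sketched as follows (a minimal Python illustration; awnb_weight is a hypothetical helper, and counting the root test as depth 1 is an assumed convention, not taken from bnclassify):

```python
import math

def awnb_weight(depths):
    """AWNB weight for one feature: the average of 1/sqrt(d) over M
    bootstrap trees, where d is the minimum depth at which the feature
    is tested (depth 1 at the root, by assumption here). A tree in which
    the feature is never tested (depth None) contributes 0.
    """
    m = len(depths)
    return sum(0.0 if d is None else 1.0 / math.sqrt(d) for d in depths) / m

# Feature tested at the root in two trees and absent from a third:
w = awnb_weight([1, 1, None])  # (1 + 1 + 0) / 3
```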
The MANB parameters correspond to Bayesian model averaging over the naive Bayes models obtained from all 2^n subsets of the n features. To obtain MANB parameter estimates, provide the manb_prior argument.
Hall M (2007). A decision tree-based attribute weighting filter for naive Bayes. Knowledge-Based Systems, 20(2), 120-126.
Dash D and Cooper GF (2002). Exact model averaging with naive Bayesian classifiers. 19th International Conference on Machine Learning (ICML-2002), 91-98.
Pigott TD (2001). A review of methods for missing data. Educational Research and Evaluation, 7(4), 353-383.
data(car)
nb <- nb('class', car)
# Maximum likelihood estimation
mle <- lp(nb, car, smooth = 0)
# Bayesian estimation
bayes <- lp(nb, car, smooth = 0.5)
# MANB
manb <- lp(nb, car, smooth = 0.5, manb_prior = 0.5)
# AWNB
awnb <- lp(nb, car, smooth = 0.5, awnb_trees = 10)