getHomogeneousEnsembleModels
OverBagging is implemented as follows: in each iteration a random subset of the data is sampled. Minority-class examples are oversampled with replacement at the given rate. Majority-class examples are either copied into each bag unchanged, or bootstrapped with replacement until the bag contains as many majority-class examples as the original training data. Features are currently neither changed nor sampled.
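The bag construction described above can be sketched in a few lines. This is a hedged, language-agnostic illustration, not mlr's implementation; the names `make_bag`, `rate`, and `maxcl` are hypothetical stand-ins for the `obw.rate` and `obw.maxcl` parameters.

```python
import random

def make_bag(minority, majority, rate=1.0, maxcl="boot"):
    """Sketch of one over-bagging iteration (hypothetical helper).

    minority, majority: lists of training examples per class.
    rate: oversampling factor for the minority class (obw.rate analogue).
    maxcl: "all" copies majority examples unchanged; "boot" bootstraps
    them back to their original count (obw.maxcl analogue).
    """
    # Minority-class examples are oversampled with replacement at the given rate.
    n_min = int(round(rate * len(minority)))
    bag = [random.choice(minority) for _ in range(n_min)]
    if maxcl == "all":
        # Copy every majority-class example into the bag unchanged.
        bag.extend(majority)
    else:
        # "boot": bootstrap with replacement up to the original majority size.
        bag.extend(random.choice(majority) for _ in range(len(majority)))
    return bag
```

With `maxcl="all"` every bag contains the full majority class plus a fresh resample of the minority class, so only the minority side varies between iterations.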
Prediction works as follows: for classification, a discrete label is obtained by majority voting over the ensemble members, and class probabilities are estimated from the proportions of the predicted labels.
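The aggregation step can be sketched as follows; `aggregate_votes` is a hypothetical helper illustrating the voting scheme described above, not mlr's prediction code.

```python
from collections import Counter

def aggregate_votes(predicted_labels):
    """Aggregate per-bag predictions for one test example (sketch).

    predicted_labels: one discrete label per ensemble member.
    Returns the majority-vote label and a dict mapping each label
    to its vote proportion, which serves as the probability estimate.
    """
    counts = Counter(predicted_labels)
    n = len(predicted_labels)
    probs = {label: c / n for label, c in counts.items()}
    # The discrete prediction is the most frequent label.
    majority_label = counts.most_common(1)[0][0]
    return majority_label, probs
```

For example, if 3 of 4 bagged models predict "pos", the discrete prediction is "pos" with an estimated probability of 0.75.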
makeOverBaggingWrapper(learner, obw.iters = 10L, obw.rate = 1, obw.maxcl = "boot")
See also: makeOversampleWrapper, makeUndersampleWrapper; oversample, undersample; smote
Other wrapper: CostSensClassifModel, CostSensClassifWrapper, makeCostSensClassifWrapper; CostSensRegrModel, CostSensRegrWrapper, makeCostSensRegrWrapper; makeBaggingWrapper; makeDownsampleWrapper; makeFeatSelWrapper; makeFilterWrapper; makeImputeWrapper; makeMulticlassWrapper; makeOversampleWrapper, makeUndersampleWrapper; makePreprocWrapperCaret; makePreprocWrapper; makeSMOTEWrapper; makeTuneWrapper; makeWeightedClassesWrapper