Fuses a base learner with a search strategy to select variables. Creates a learner object, which can be used like any other learner object, but which internally uses selectFeatures. If the train function is called on it, the search strategy and resampling are invoked to select an optimal set of variables. Finally, a model is fitted on the complete training data with these variables and returned. See selectFeatures for more details.
After training, the optimal features (and other related information) can be retrieved with getFeatSelResult.
makeFeatSelWrapper(
  learner,
  resampling,
  measures,
  bit.names,
  bits.to.features,
  control,
  show.info = getMlrOption("show.info")
)
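To make the call above concrete, here is a minimal sketch, assuming the mlr package is loaded and using its built-in iris.task (classif.rpart and the small random-search budget are arbitrary choices for illustration):

library(mlr)
# wrap a base learner with a random search over feature subsets
ctrl = makeFeatSelControlRandom(maxit = 5L)
lrn = makeFeatSelWrapper("classif.rpart",
  resampling = makeResampleDesc("Holdout"),
  control = ctrl)
# train() runs the search with inner resampling, then refits on the selected features
mod = train(lrn, iris.task)
# retrieve the selected feature set and the performance of the best candidate
getFeatSelResult(mod)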
Value: Learner.

Arguments:

learner (Learner | character(1)): The learner. If you pass a string, the learner will be created via makeLearner.
resampling (ResampleInstance | ResampleDesc): Resampling strategy for feature selection. If you pass a description, it is instantiated once at the beginning by default, so all points are evaluated on the same training/test sets. If you want to change that behavior, look at FeatSelControl. (See the construction sketch after the argument descriptions.)
measures (list of Measure | Measure): Performance measures to evaluate. The first measure, aggregated by the first aggregation function, is optimized; the others are simply evaluated. Default is the default measure for the task, see getDefaultMeasure.
bit.names (character): Names of bits encoding the solutions. Also defines the total number of bits in the encoding. Per default these are the feature names of the task. Has to be used together with bits.to.features.
bits.to.features (function(x, task)): Function which transforms an integer-0-1 vector into a character vector of selected features. Per default, a value of 1 in the i-th bit selects the i-th feature to be in the candidate solution. The vector x corresponds to bit.names and has to be of the same length. (See the custom-encoding sketch after the argument descriptions.)
control (FeatSelControl): Control object for the search method. Also selects the optimization algorithm for feature selection.
show.info (logical(1)): Print verbose output on console? Default is set via configureMlr.
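As a hedged illustration of the resampling, measures and control arguments, a wrapper could be constructed as below (3-fold CV, the mmce/timetrain measures and sequential forward search are arbitrary choices):

# a description is instantiated once at the start; an instance fixes the splits explicitly
rdesc = makeResampleDesc("CV", iters = 3L)
rin = makeResampleInstance(rdesc, task = iris.task)
lrn = makeFeatSelWrapper("classif.rpart",
  resampling = rin,
  measures = list(mmce, timetrain),  # mmce is optimized, timetrain is only recorded
  control = makeFeatSelControlSequential(method = "sfs"))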
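The bit encoding can also be customized. Below is a hedged sketch in which each bit toggles a whole group of iris features; the grouping and its names are invented purely for illustration:

# two bits: one for the sepal measurements, one for the petal measurements
groups = list(
  sepal = c("Sepal.Length", "Sepal.Width"),
  petal = c("Petal.Length", "Petal.Width")
)
lrn = makeFeatSelWrapper("classif.rpart",
  resampling = makeResampleDesc("Holdout"),
  bit.names = names(groups),
  bits.to.features = function(x, task) as.character(unlist(groups[as.logical(x)])),
  control = makeFeatSelControlRandom(maxit = 2L))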
Other featsel: FeatSelControl, analyzeFeatSelResult(), getFeatSelResult(), selectFeatures()
Other wrapper: makeBaggingWrapper(), makeClassificationViaRegressionWrapper(), makeConstantClassWrapper(), makeCostSensClassifWrapper(), makeCostSensRegrWrapper(), makeDownsampleWrapper(), makeDummyFeaturesWrapper(), makeExtractFDAFeatsWrapper(), makeFilterWrapper(), makeImputeWrapper(), makeMulticlassWrapper(), makeMultilabelBinaryRelevanceWrapper(), makeMultilabelClassifierChainsWrapper(), makeMultilabelDBRWrapper(), makeMultilabelNestedStackingWrapper(), makeMultilabelStackingWrapper(), makeOverBaggingWrapper(), makePreprocWrapperCaret(), makePreprocWrapper(), makeRemoveConstantFeaturesWrapper(), makeSMOTEWrapper(), makeTuneWrapper(), makeUndersampleWrapper(), makeWeightedClassesWrapper()
# nested resampling with feature selection (with a nonsense algorithm for selection)
outer = makeResampleDesc("CV", iters = 2L)
inner = makeResampleDesc("Holdout")
ctrl = makeFeatSelControlRandom(maxit = 1)
lrn = makeFeatSelWrapper("classif.ksvm", resampling = inner, control = ctrl)
# we also extract the selected features for all iterations here
r = resample(lrn, iris.task, outer, extract = getFeatSelResult)
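The per-iteration selection results can then be inspected via the extract slot of the resample result; a brief sketch (the x slot of a FeatSelResult holds the chosen feature names):

# one FeatSelResult object per outer iteration
r$extract
# selected feature names per outer iteration
lapply(r$extract, function(res) res$x)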