The prediction method for objects inheriting from the RuleSetRST class.

Usage:

# S3 method for RuleSetRST
predict(object, newdata, ...)
Value:

A data.frame with a single column containing predictions for objects from newdata.
Arguments:

object: an object inheriting from the "RuleSetRST" class. Such objects are typically produced by implementations of rule induction methods derived from rough set theory (RST), such as RI.indiscernibilityBasedRules.RST, RI.CN2Rules.RST, RI.LEM2Rules.RST or RI.AQRules.RST.
newdata: an object inheriting from the "DecisionTable" class, which represents the data for which predictions are to be made. See SF.asDecisionTable. Columns in newdata should correspond to the columns of the data set used for the rule induction.
...: additional parameters for a rule voting strategy. They apply only to methods which classify new objects by voting. Currently, those methods are RI.LEM2Rules.RST and RI.AQRules.RST, which accept a named parameter votingMethod. This parameter can be used to pass a custom function for computing the weight of a voting rule. Three such functions are already available in the package:

- X.ruleStrength is the default voting method. It is defined as the product of the cardinality of the support of a rule and the length of this rule. See X.ruleStrength.
- X.laplace corresponds to voting weighted by the Laplace estimates of the rules' confidence. See X.laplace.
- X.rulesCounting corresponds to voting by counting the matching rules for different decision classes. See X.rulesCounting.

A custom function passed using votingMethod can receive additional parameters through the ... interface.
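As a sketch of that interface, a custom voting function can be passed alongside its own named parameters. The function name myVoting and the rule attribute accessed below are illustrative assumptions, not part of the package API; consult the definition of X.laplace or X.ruleStrength for the exact structure of a matched rule before relying on this.

```r
## Hypothetical custom voting function (a sketch, not the package's API).
## We assume here that a matched rule carries a "support" attribute, as the
## built-in voting functions use; verify this against X.ruleStrength's source.
myVoting <- function(rule, weightFactor = 1.0) {
  ## weight a rule's vote by a scaled size of its support
  weightFactor * length(attr(rule, "support"))
}

## extra named arguments after votingMethod are forwarded to the custom
## function through the ... interface of the predict method:
## pred <- predict(rules, data.tst,
##                 votingMethod = myVoting, weightFactor = 2.0)
```

The call is commented out because it requires a fitted rules object and test data, such as those constructed in the example below.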
Author: Andrzej Janusz
See Also:

Rule induction methods implemented within RST include: RI.indiscernibilityBasedRules.RST, RI.CN2Rules.RST, RI.LEM2Rules.RST and RI.AQRules.RST. For details on rule induction methods based on FRST, see RI.GFRS.FRST and RI.hybridFS.FRST.
##############################################
## Example: Classification Task
##############################################
data(RoughSetData)
wine.data <- RoughSetData$wine.dt
set.seed(13)
wine.data <- wine.data[sample(nrow(wine.data)),]
## Split the data into a training set and a test set,
## 60% for training and 40% for testing:
idx <- round(0.6 * nrow(wine.data))
wine.tra <- SF.asDecisionTable(wine.data[1:idx, ],
                               decision.attr = 14,
                               indx.nominal = 14)
wine.tst <- SF.asDecisionTable(wine.data[(idx+1):nrow(wine.data), -ncol(wine.data)])
true.classes <- wine.data[(idx+1):nrow(wine.data), ncol(wine.data)]
## discretization:
cut.values <- D.discretization.RST(wine.tra,
type.method = "unsupervised.quantiles",
nOfIntervals = 3)
data.tra <- SF.applyDecTable(wine.tra, cut.values)
data.tst <- SF.applyDecTable(wine.tst, cut.values)
## rule induction from the training set:
rules <- RI.LEM2Rules.RST(data.tra)
## predictions for the test set:
pred.vals1 <- predict(rules, data.tst)
pred.vals2 <- predict(rules, data.tst,
votingMethod = X.laplace)
pred.vals3 <- predict(rules, data.tst,
votingMethod = X.rulesCounting)
## checking the accuracy of predictions:
mean(pred.vals1 == true.classes)
mean(pred.vals2 == true.classes)
mean(pred.vals3 == true.classes)