SoftClassForest: Implementing a Random Forest of SDTs.
Description
SoftClassForest builds a Random Forest of Soft Decision Trees (SDTs) for categorical responses and returns
the fitted classification given by the majority vote of the individual SDTs.
Usage
SoftClassForest(trainresponses, train, test, ntry, ntrees, depth,
bag = TRUE)
Arguments
trainresponses
A matrix or data frame of 0/1 response indicators for the training set, with one row per training observation and one column per possible classification.
train
A matrix or data frame of the candidate predictor variables for the training set.
test
A matrix or data frame of the candidate predictor variables for the test set.
ntry
A numeric giving the number of variables, out of num.features, sampled as split candidates at each node. Sampling fewer than all variables is what makes the forest random; for a standard tree, set ntry = num.features.
ntrees
A numeric giving the number of SDTs to build in the Random Forest.
depth
A numeric giving the depth of each SDT. A tree of this depth has \(2^{depth - 1}\) terminal nodes.
bag
Logical; if TRUE, each tree is built on a bootstrap sample of the training data (bootstrap aggregating), and if FALSE, each tree is built on the raw data.
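The trainresponses argument is an indicator matrix rather than a single label vector. A minimal sketch of how such a matrix can be built from a factor of class labels (the labels used here are illustrative, not from the package):

```r
## Hypothetical example labels; any factor of class labels works the same way.
y <- factor(c("setosa", "versicolor", "setosa", "virginica"))

## One column per class, one row per observation: entry is 1 when the
## observation belongs to that class, 0 otherwise.
trainresponses <- sapply(levels(y), function(cl) as.numeric(y == cl))
```

Each row of the resulting matrix sums to 1, since every observation belongs to exactly one class.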
Value
A vector of the final classifications based on the Random Forest generated.
Details
SoftClassForest fits a Random Forest for each possible classification response using the SoftForestPredFeeder function,
one classification at a time. Each of these SDTs returns a fitted probability that an observation belongs to the given class.
Once every classification has a fitted probability, each SDT classifies the observation as the class with the maximum a posteriori probability.
Given the Random Forest of SDTs, the final classification is the majority vote across the SDTs.
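The two-stage aggregation described above (per-tree maximum a posteriori classification, then a majority vote across trees) can be sketched in base R. This is an illustration of the voting scheme, not the package's internal code; the function name and the array layout of the probabilities are assumptions.

```r
## probs: array of dim c(ntrees, nobs, nclasses) holding each tree's fitted
## class probabilities; classes: character vector of class labels.
## (Hypothetical helper, not part of the package.)
classify_forest <- function(probs, classes) {
  ntrees <- dim(probs)[1]
  nobs   <- dim(probs)[2]
  votes  <- matrix(NA_character_, ntrees, nobs)
  for (t in seq_len(ntrees)) {
    ## Each tree votes for its maximum a posteriori class per observation.
    votes[t, ] <- classes[max.col(probs[t, , ])]
  }
  ## Final classification: majority vote across the trees
  ## (ties are broken by the first class encountered).
  apply(votes, 2, function(v) names(which.max(table(v))))
}
```

For example, with three trees and two classes, an observation classified "A" by two trees and "B" by one receives the final label "A".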