frbs (version 2.0-0)

frbs.learn: The frbs model building function

Description

This is one of the central functions of the package. This function is used to generate/learn the model from numerical data using fuzzy rule-based systems.

Usage

frbs.learn(data.train, range.data = NULL,
    method.type = c("WM"), control = list())

Arguments

data.train
a data frame or matrix (m x n) of data for the training process, where m is the number of instances and n is the number of variables; the last column is the output variable. It should be noted that the training data must be expressed in numbers (numerical data).
range.data
a matrix (2 x n) containing the range of the data, where n is the number of variables, and the first and second rows are the minimum and maximum values, respectively. It should be noted that for "FRBCS.W", "FRBCS.CHI", "GFS.GCCL", "FH.GBML", and "SLAVE" the range must cover the input variables only (see the classification examples below). If range.data is omitted, it is determined from the minimum and maximum values of the training data.
method.type
this parameter determines the learning algorithm to be used. The following methods are implemented:
  • "WM": Wang and Mendel's technique to handle regression tasks;
  • "SBC": subtractive clustering method to handle regression tasks;
control
a list containing all arguments, depending on the learning algorithm to use.

WM method

  • num.labels: a positive integer determining the number of labels (fuzzy terms). The default value is 7.
  • type.mf: the type of the membership function, given as an integer code; in the example below, type.mf = 3 selects the Gaussian membership function.
The control parameters of the other methods are documented in the corresponding method functions listed under "See Also".

Value

The frbs-object, i.e. the generated model, which can be passed to predict for the prediction phase.

Details

This function provides a single interface to all fourteen learning methods implemented in this package, so users do not need to call the individual learning functions during the learning phase. In order to obtain good results, users need to adjust method-dependent parameters such as the number of labels, the shape of the membership function, the maximal number of iterations, or the step size of the gradient descent, which are collected in the control argument. After the model has been created with this function, it can be used to predict new data with predict.
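
For illustration, a minimal sketch of this learn-then-predict workflow is shown below. It uses a small synthetic data set and untuned parameter values, so it only demonstrates the calling pattern, not a recommended configuration.

set.seed(1)
dt  <- matrix(runif(60), ncol = 3)    # two input variables, one output variable
rng <- apply(dt, 2, range)            # 2 x 3 matrix of minimum and maximum values
mod <- frbs.learn(dt, rng, method.type = "WM",
                  control = list(num.labels = 3, type.mf = 3))
pred <- predict(mod, dt[, 1:2])       # prediction uses the input columns only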

See Also

predict for the prediction phase, and the following main functions of each of the methods for theoretical background and references: WM, SBC, HyFIS, ANFIS, FIR.DM, DENFIS, FS.HGD, FRBCS.W, FRBCS.CHI, GFS.FR.MOGUL, GFS.Thrift, GFS.GCCL, FH.GBML, and SLAVE.

Examples

##################################
## I. Regression Problem
## Suppose data have two input variables and one output variable.
## We separate them into training, fitting, and testing data.
## data.train, data.fit, data.test, and range.data are inputs
## for all regression methods.
###################################
## Take into account that the simulation might take a long time
## depending on the hardware you are using. The chosen parameters
## may not be optimal.
## Data must be in data.frame or matrix form and the last column
## is the output variable/attribute.
## The training data must be expressed in numbers (numerical data).
data.train <- matrix(c(5.2, -8.1, 4.8, 8.8, -16.1, 4.1, 10.6, -7.8, 5.5, 10.4, -29.0,
                      5.0, 1.8, -19.2, 3.4, 12.7, -18.9, 3.4, 15.6, -10.6, 4.9, 1.9,
                      -25.0, 3.7, 2.2, -3.1, 3.9, 4.8, -7.8, 4.5, 7.9, -13.9, 4.8,
                      5.2, -4.5, 4.9, 0.9, -11.6, 3.0, 11.8, -2.1, 4.6, 7.9, -2.0,
                      4.8, 11.5, -9.0, 5.5, 10.6, -11.2, 4.5, 11.1, -6.1, 4.7, 12.8,
                      -1.0, 6.6, 11.3, -3.6, 5.1, 1.0, -8.2, 3.9, 14.5, -0.5, 5.7,
                      11.9, -2.0, 5.1, 8.1, -1.6, 5.2, 15.5, -0.7, 4.9, 12.4, -0.8,
                      5.2, 11.1, -16.8, 5.1, 5.1, -5.1, 4.6, 4.8, -9.5, 3.9, 13.2,
                      -0.7, 6.0, 9.9, -3.3, 4.9, 12.5, -13.6, 4.1, 8.9, -10.0,
                      4.9, 10.8, -13.5, 5.1), ncol = 3, byrow = TRUE)
colnames(data.train) <- c("inp.1", "inp.2", "out.1")

data.fit <- matrix(c(10.5, -0.9, 5.2, 5.8, -2.8, 5.6, 8.5, -0.2, 5.3, 13.8, -11.9,
                     3.7, 9.8, -1.2, 4.8, 11.0, -14.3, 4.4, 4.2, -17.0, 5.1, 6.9,
                     -3.3, 5.1, 13.2, -1.9, 4.6), ncol = 3, byrow = TRUE)

data.test <- matrix(c(10.5, -0.9, 5.8, -2.8, 8.5, -0.2, 13.8, -11.9, 9.8, -1.2, 11.0,
                     -14.3, 4.2, -17.0, 6.9, -3.3, 13.2, -1.9), ncol = 2, byrow = TRUE)

## range of the data: columns correspond to inp.1, inp.2, and out.1;
## the first row holds the minimum and the second row the maximum values
range.data <- matrix(c(0.9, 15.6, -29, -0.2, 3, 6.6), ncol = 3, byrow = FALSE)

#############################################################
## I.1 Example: Implementation of Wang & Mendel
#############################################################
method.type <- "WM"

## collect control parameters into a list
## num.labels = 3 means we use 3 fuzzy terms (labels) per variable
## type.mf = 3 means we use the Gaussian membership function
control.WM <- list(num.labels = 3, type.mf = 3, name = "Sim-0")

## generate the model and save it as object.WM
object.WM <- frbs.learn(data.train, range.data, method.type, control.WM)

#############################################################
## I.2 Example: Implementation of SBC
#############################################################
method.type <- "SBC"
control.SBC <- list(r.a = 0.5, eps.high = 0.5, eps.low = 0.15, name = "Sim-0")

object.SBC <- frbs.learn(data.train, range.data, method.type, control.SBC)

#############################################################
## I.3 Example: Implementation of HYFIS
#############################################################
method.type <- "HYFIS"

control.HYFIS <- list(num.labels = 5, max.iter = 50, step.size = 0.01,
                 name = "Sim-0")

object.HYFIS <- frbs.learn(data.train, range.data, method.type, control.HYFIS)

#############################################################
## I.4 Example: Implementation of ANFIS
#############################################################
method.type <- "ANFIS"

control.ANFIS <- list(num.labels = 5, max.iter = 100, step.size = 0.01,
                      name = "Sim-0")

object.ANFIS <- frbs.learn(data.train, range.data, method.type, control.ANFIS)

#############################################################
## I.5 Example: Implementation of DENFIS
#############################################################

control.DENFIS <- list(Dthr = 0.1, max.iter = 100, step.size = 0.001, d = 2,
                       name = "Sim-0")
method.type <- "DENFIS"

object.DENFIS <- frbs.learn(data.train, range.data, method.type, control.DENFIS)

#############################################################
## I.6 Example: Implementation of FIR.DM
#############################################################
method.type <- "FIR.DM"

control.DM <- list(num.labels = 5, max.iter = 100, step.size = 0.01, name = "Sim-0")
object.DM <- frbs.learn(data.train, range.data, method.type, control.DM)

#############################################################
## I.7 Example: Implementation of FS.HGD
#############################################################
method.type <- "FS.HGD"

control.HGD <- list(num.labels = 5, max.iter = 100, step.size = 0.01,
               alpha.heuristic = 1, name = "Sim-0")
object.HGD <- frbs.learn(data.train, range.data, method.type, control.HGD)

#############################################################
## I.8 Example: Implementation of GFS.FR.MOGUL
#############################################################
method.type <- "GFS.FR.MOGUL"

control.GFS.FR.MOGUL <- list(persen_cross = 0.6,
                    max.iter = 20, max.gen = 10, max.tune = 10, persen_mutant = 0.3,
                    epsilon = 0.8, name="sim-0")
object.GFS.FR.MOGUL <- frbs.learn(data.train, range.data,
                       method.type, control.GFS.FR.MOGUL)

#############################################################
## I.9 Example: Implementation of Thrift's method (GFS.THRIFT)
#############################################################
method.type <- "GFS.THRIFT"

control.Thrift <- list(popu.size = 15, num.labels = 3, persen_cross = 1,
                    max.gen = 5, persen_mutant = 1,
                    name="sim-0")
object.Thrift <- frbs.learn(data.fit, range.data, method.type, control.Thrift)
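
#############################################################
## Prediction sketch for the regression part (an illustrative addition,
## not in the original example): data.test holds the two input variables
## only, and data.fit can be used to check the fit. The object names
## res.fit.WM and res.test.WM are hypothetical.
#############################################################
res.fit.WM  <- predict(object.WM, data.fit[, 1:2])
res.test.WM <- predict(object.WM, data.test)
## e.g. root mean squared error on the fitting data
sqrt(mean((res.fit.WM - data.fit[, 3])^2))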

#############################################################
## II. Classification Problem
#############################################################
## The iris dataset is shuffled and divided into training and
## testing data. Poor prediction results may be caused by classes
## that happen to be imbalanced in the training data after shuffling.
## Take into account that the simulation may take a long time
## depending on the hardware you use.
## One may get better results with other parameters.
## Data are in data.frame or matrix form and the last column is
## the output variable/attribute
## The data must be expressed in numbers (numerical data).

data(iris)
irisShuffled <- iris[sample(nrow(iris)),]
irisShuffled[,5] <- unclass(irisShuffled[,5])
tra.iris <- irisShuffled[1:105,]
tst.iris <- irisShuffled[106:nrow(irisShuffled),1:4]
real.iris <- matrix(irisShuffled[106:nrow(irisShuffled),5], ncol = 1)

## Note that for the classification methods the range matrix covers the input variables only.
range.data.input <- matrix(c(4.3, 7.9, 2.0, 4.4, 1.0, 6.9, 0.1, 2.5), nrow=2)
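
## Illustrative sketch (an addition, not part of the original example): an
## equivalent range matrix can also be computed directly from the data.
## range.data.alt is a hypothetical name used only here.
range.data.alt <- apply(iris[, -5], 2, range)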

#########################################################
## II.1 Example: Implementation of FRBCS with weight factor based on Ishibuchi's method
###############################################################
## generate the model
method.type <- "FRBCS.W"
control <- list(num.labels = 7, type.mf = 1)

object <- frbs.learn(tra.iris, range.data.input, method.type, control)

## conduct the prediction process
res.test <- predict(object, tst.iris)

#########################################################
## II.2 Example: Implementation of FRBCS based on Chi's method
###############################################################
## generate the model
method.type <- "FRBCS.CHI"
control <- list(num.labels = 7, type.mf = 1)

object <- frbs.learn(tra.iris, range.data.input, method.type, control)

## conduct the prediction process
res.test <- predict(object, tst.iris)

#########################################################
## II.3 Example: Implementation of GFS.GCCL
###############################################################
method.type <- "GFS.GCCL"

control <- list(popu.size = 30, num.class = 3, num.labels = 5, persen_cross = 0.9,
                    max.gen = 200, persen_mutant = 0.3,
                    name="sim-0")
## Training process
## The main result of the training is a rule database which is used later for prediction.
object <- frbs.learn(tra.iris, range.data.input, method.type, control)
## Prediction process
res.test <- predict(object, tst.iris)

#########################################################
## II.4 Example: Implementation of FH.GBML
###############################################################
method.type <- "FH.GBML"

control <- list(popu.size = 10, max.num.rule = 50, num.class = 3,
                persen_cross = 0.9, max.gen = 200, persen_mutant = 0.3,
                p.dcare = 0.5, p.gccl = 1, name = "sim-0")

## Training process
## The main result of the training is a rule database which is used later for prediction.
object <- frbs.learn(tra.iris, range.data.input, method.type, control)

## Prediction process
res.test <- predict(object, tst.iris)

#########################################################
## II.5 Example: Implementation of SLAVE
###############################################################
method.type <- "SLAVE"

control <- list(num.class = 3, num.labels = 5, persen_cross = 0.9,
                max.iter = 50, max.gen = 30, persen_mutant = 0.3,
                k.lower = 0.25, k.upper = 0.75, epsilon = 0.1, name = "sim-0")

## Training process
## The main result of the training is a rule database which is used later for prediction.
object <- frbs.learn(tra.iris, range.data.input, method.type, control)

## Prediction process
res.test <- predict(object, tst.iris)
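
#########################################################
## Evaluation sketch (an illustrative addition, not part of the original
## example): compare the predicted classes with the true classes stored
## in real.iris. err.percent is a hypothetical name used only here.
###############################################################
err.percent <- 100 * sum(res.test != real.iris) / nrow(real.iris)
err.percent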
