
isoboost (version 1.0.1)

asilb: (Adjacent-categories) Simple Isotonic LogitBoost

Description

Trains and predicts with a LogitBoost-based classification algorithm that uses isotonic regression (decision stumps for non-monotone features) as weak learners, based on the adjacent-categories logistic model (see Agresti (2010)). For full details on this algorithm, see Conde et al. (2020).

Usage

asilb(xlearn, ...)

# S3 method for class 'formula'
asilb(formula, data, ...)

# Default S3 method
asilb(xlearn, ylearn, xtest = xlearn, mfinal = 100,
      monotone_constraints = rep(0, dim(xlearn)[2]), prior = NULL, ...)

Arguments

formula

A formula of the form groups ~ x1 + x2 + .... That is, the response is the class variable and the right hand side specifies the explanatory variables.

data

Data frame from which variables specified in formula are to be taken.

xlearn

(Required if no formula is given as the principal argument.) A data frame or matrix containing the explanatory variables.

ylearn

(Required if no formula is given as the principal argument.) A numeric vector or factor with numeric levels specifying the class for each observation.

xtest

A data frame or matrix of cases to be classified, containing the features used in formula or xlearn.

mfinal

Number of iterations of the algorithm.

monotone_constraints

A numeric vector of 1s, 0s and -1s whose length equals the number of features in xlearn: 1 imposes an increasing constraint, -1 a decreasing constraint, and 0 no constraint.

prior

The prior probabilities of class membership. If unspecified, equal prior probabilities are used. If present, the probabilities must be specified in the order of the factor levels.

...

Arguments passed to or from other methods.
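To illustrate how the two vector-valued arguments above are typically built, here is a minimal base-R sketch; the column names (amp1, amp2, other) and the four-class response are hypothetical, not taken from the package:

```r
## Hypothetical predictors: two amplitudes expected to decrease with the
## response, plus one unconstrained feature
xlearn <- data.frame(amp1 = rnorm(20), amp2 = rnorm(20), other = rnorm(20))

## monotone_constraints: -1 (decreasing) for the amplitude features,
## 0 (no constraint) for the rest -- one entry per column of xlearn
monotone_constraints <- ifelse(grepl("^amp", names(xlearn)), -1, 0)
monotone_constraints
## [1] -1 -1  0

## prior: one probability per class, given in the order of the factor
## levels; here, equal priors over four classes
ylearn <- factor(sample(1:4, 20, replace = TRUE), levels = 1:4)
prior <- rep(1 / nlevels(ylearn), nlevels(ylearn))
```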

Value

A list containing the following components:

call

The (matched) function call.

trainset

Matrix with the training set used (first columns) and the class for each observation (last column).

prior

Prior probabilities of class membership used.

apparent

Apparent error rate.

mfinal

Number of iterations of the algorithm.

loglikelihood

Log-likelihood.

posterior

Posterior probabilities of class membership for xtest set.

class

Labels of the class with maximal probability for xtest set.

References

Agresti, A. (2010). Analysis of Ordinal Categorical Data, 2nd edition. John Wiley and Sons. New Jersey.

Conde, D., Fernandez, M. A., Rueda, C., and Salvador, B. (2020). Isotonic boosting classification rules. Advances in Data Analysis and Classification, 1-25.

See Also

amilb, csilb, cmilb

Examples

data(motors)
table(motors$condition)
##  1  2  3  4 
## 83 67 70 60 

## Let us consider the first three variables as predictors
data <- motors[, 1:3]
grouping <- motors$condition
## 
## Lower values of the amplitudes are expected to be 
## related to higher levels of damage severity, so 
## we can consider the following monotone constraints
monotone_constraints <- rep(-1, 3)

set.seed(7964)
values <- runif(dim(data)[1])
trainsubset <- values < 0.2
obj <- asilb(data[trainsubset, ], grouping[trainsubset],
             data[!trainsubset, ], 50, monotone_constraints)

## Apparent error
obj$apparent
## 4.761905

## Error rate
100 * mean(obj$class != grouping[!trainsubset])
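The formula interface shown under Usage can be called equivalently; a sketch, assuming the isoboost package is installed and using the same motors data (by default the formula method classifies the training set itself, since xtest = xlearn):

```r
library(isoboost)
data(motors)
## Bind the three predictors and the class variable into one data frame
df <- data.frame(motors[, 1:3], condition = motors$condition)
## The response on the left of ~ is the class variable
obj_f <- asilb(condition ~ ., data = df)
obj_f$apparent
```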
