liquidSVM (version 1.2.2)

nplSVM: Neyman-Pearson-Learning

Description

This routine provides binary classifiers that satisfy a predefined error rate on one type of error and that simultaneously minimize the other type of error. For convenience, some points on the ROC curve around the predefined error rate are returned. nplSVM performs Neyman-Pearson learning for classification.

Usage

nplSVM(x, y, ..., class = 1, constraint = 0.05,
  constraint.factors = c(3, 4, 6, 9, 12)/6, do.select = TRUE)

Arguments

x

either a formula or the features

y

either the data or the labels corresponding to the features x. It can be a character string, in which case the data set of that name is loaded using liquidData. If it is of type liquidData, then after training and selection the model is tested using the testing data (y$test).

...

configuration parameters, see Configuration. Can be threads=2, display=1, gpus=1, etc.

class

the label of the normal class (the other class becomes the alarm class)

constraint

the false alarm rate that should be achieved

constraint.factors

specifies the factors around constraint at which additional ROC-curve points are computed; see the sketch after this argument list

do.select

if TRUE, the selection phase is also run for this model
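
The constraint together with constraint.factors determines which false alarm rates are targeted. Assuming the factors are applied multiplicatively to constraint, which the defaults (0.5, 0.67, 1, 1.5, 2 after dividing by 6) suggest, a minimal sketch of the resulting targets is:

constraint <- 0.05
constraint.factors <- c(3, 4, 6, 9, 12) / 6
constraint * constraint.factors   # 0.025 0.0333 0.05 0.075 0.1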

Value

an object of type svm. Depending on the usage, this object also has the $train_errors, $select_errors, and $last_result properties.
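
A minimal usage sketch of inspecting these properties; the 'banana-bc' data set and the property names come from this page, while the exact contents of the returned tables should be treated as an assumption:

library(liquidSVM)
banana <- liquidData('banana-bc')            # load a bundled binary classification data set
model <- nplSVM(Y ~ ., banana, display = 1)  # train and run selection
model$select_errors                          # errors recorded during the selection phase
model$last_result                            # result of the test on banana$test (see the y argument)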

Details

Please look at the demo vignette (vignette('demo')) for more examples. The labels should only take the values 1 and -1.
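
If your labels are stored as a two-level factor, a minimal recoding sketch (the data frame df and column Y are only illustrative):

df$Y <- ifelse(df$Y == levels(df$Y)[1], 1, -1)   # first factor level becomes the +1 class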

min_weight, max_weight, weight_steps: you might have to define which weighted classification problems will be considered. The choice is usually a bit tricky. Good luck ...
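
These can be passed as configuration parameters through .... The parameter names are taken from the paragraph above; the concrete grid values are only illustrative:

model <- nplSVM(Y ~ ., 'banana-bc',
                min_weight = 0.001,   # smallest class weight in the grid
                max_weight = 1000,    # largest class weight in the grid
                weight_steps = 10,    # number of weighted problems considered
                display = 1)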

Examples

model <- nplSVM(Y ~ ., 'banana-bc', display=1)

## a worked example can be seen at
vignette("demo",package="liquidSVM")