bmrm (version 3.3)

softMarginVectorLoss: Soft Margin Vector Loss function for multiclass SVM

Description

Soft Margin Vector Loss function for multiclass SVM
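
To make the definition concrete, here is a hedged sketch (not the package's actual implementation) of the margin-rescaled multiclass hinge loss this risk term corresponds to in Teo et al. (2007): for each instance t, the loss is the maximum over classes p of l(t,p) + <w_p, x(t,)> - <w_y(t), x(t,)>, accumulated over the instances. The helper name softMarginByHand and the plain sum over instances are illustrative assumptions.

  # Illustrative sketch only: a by-hand margin-rescaled multiclass hinge loss,
  # where W is a ncol(x) x nlevels(y) weight matrix (one column per class).
  softMarginByHand <- function(W, x, y, l = 1 - table(seq_along(y), y)) {
    f <- x %*% W                                      # per-class scores
    gap <- f - f[cbind(seq_along(y), as.integer(y))]  # score difference to the true class
    sum(apply(unclass(l) + gap, 1, max))              # loss-augmented max, summed over instances
  }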

Usage

softMarginVectorLoss(x, y, l = 1 - table(seq_along(y), y))

Arguments

x

instance matrix, where x(t,) defines the features of instance t

y

target vector where y(t) is an integer encoding target of x(t,)

l

loss matrix. l(t,p) must be the loss incurred when predicting class p instead of the true target y(t) for instance t. By default, l is the 0/1 loss matrix 1 - table(seq_along(y), y), i.e. 0 for the true class and 1 for every other class (see the sketch below).

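As a hedged illustration of the l argument (the object names below are made up for this example), a custom loss matrix can be derived from the default 0/1 matrix:

  y <- iris$Species
  l01 <- 1 - table(seq_along(y), y)                  # default 0/1 loss: 0 on the true class, 1 elsewhere
  l.custom <- l01
  l.custom[, "setosa"] <- 2 * l.custom[, "setosa"]   # make wrongly predicting "setosa" twice as costly
  # then pass it in: softMarginVectorLoss(x, y, l = l.custom)
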
Value

a function taking one argument w and computing the loss value and the gradient at point w
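
For example (a minimal sketch; the exact structure of the returned object may differ between bmrm versions, so it is only inspected with str() here), the returned risk function can be evaluated directly at a parameter vector:

  library(bmrm)
  x <- cbind(intercept=100, data.matrix(iris[c(1,2)]))
  y <- iris$Species
  riskFun <- softMarginVectorLoss(x, y)    # the function of w described above
  w0 <- numeric(ncol(x) * nlevels(y))      # zero start: one weight column per class, stored as a vector
  str(riskFun(w0))                         # loss value and gradient at w0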

References

Teo et al. A Scalable Modular Convex Solver for Regularized Risk Minimization. KDD 2007

Examples

  library(bmrm)

  # -- Build a 2D dataset from iris (Sepal.Length, Sepal.Width), and add a constant intercept column
  x <- cbind(intercept=100, data.matrix(iris[c(1,2)]))
  y <- iris$Species
  
  # -- build the multiclass SVM model
  w <- bmrm(softMarginVectorLoss(x,y))
  dim(w) <- c(ncol(x),nlevels(y))
  dimnames(w) <- list(colnames(x),levels(y))
  F <- x %*% w
  pred <- colnames(F)[max.col(F)]
  table(pred,y)
  
  # -- Plot the dataset, the decision boundaries, the convergence curve, and the predictions
  gx <- seq(min(x[,2]),max(x[,2]),length=200) # positions of the probes on x-axis
  gy <- seq(min(x[,3]),max(x[,3]),length=200) # positions of the probes on y-axis
  Y <- outer(gx,gy,function(a,b){      # predicted class at each probe position,
     max.col(cbind(100,a,b) %*% w)     # using the same constant intercept (100) as the training data
  })
  layout(matrix(c(1,3,2,3),2,2))       # 3 panels: decision map, convergence curve, predictions
  image(gx,gy,Y,asp=1,main="dataset & decision boundaries",xlab=colnames(x)[2],ylab=colnames(x)[3])
  points(x[,-1],pch=19+as.integer(y))
  plot(attr(w,"log")$epsilon,type="o",ylab="epsilon gap",xlab="iteration")
  plot(row(F),F,pch=19+col(F),ylab="prediction values",xlab="sample")
