
svmCMA: Support Vector Machine

Description:

Calls the function svm from the package e1071 that provides an interface to
the award-winning LIBSVM routines. For S4 method information, see
svmCMA-methods.
Usage:

svmCMA(X, y, f, learnind, probability, models = FALSE, seed = 341, ...)
Arguments:

X: Gene expression data. Can be one of the following:
   - A matrix. Rows correspond to observations, columns to variables.
   - A data.frame, when f is not missing (see below).
   - An object of class ExpressionSet.
y: Class labels. Can be one of the following:
   - A numeric vector.
   - A factor.
   - A character, if X is an ExpressionSet; it then specifies the phenotype
     variable holding the class labels.
   - missing, if X is a data.frame and a proper formula f is provided.
   WARNING: The class labels will be re-coded to range from 0 to K-1, where K
   is the total number of different classes in the learning set (a short
   illustration of this re-coding follows the argument list).
f: A two-sided formula, if X is a data.frame. The left part corresponds to the
   class labels, the right part to the variables.

learnind: An index vector specifying the observations that belong to the
   learning set. May be missing; in that case, the learning set consists of
   all observations and predictions are made on the learning set.

probability: Logical indicating whether the model should allow for probability
   predictions.

models: A logical value indicating whether the model object shall be returned.

seed: Fixes the random number generator for reproducibility.

...: Further arguments to be passed to svm from the package e1071.
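As a small illustrative aside (not part of the original help text), the
re-coding mentioned in the WARNING for y can be mimicked in plain R; the toy
factor below is made up purely for illustration:

### toy illustration of the 0,...,K-1 re-coding applied to class labels
yfac <- factor(c("ALL", "AML", "ALL", "AML"))
as.integer(yfac) - 1   # 0 1 0 1 -- the internal coding for the two classes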
References:

Chang, Chih-Chung and Lin, Chih-Jen: LIBSVM: a library for Support Vector
Machines. http://www.csie.ntu.edu.tw/~cjlin/libsvm

Schoelkopf, B., Smola, A.J. (2002). Learning with kernels. MIT Press,
Cambridge, MA.
See Also:

compBoostCMA, dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA, knnCMA,
ldaCMA, LassoCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA,
pnnCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA
Examples:

### load the CMA package and the Golub AML/ALL data
library(CMA)
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression
golubX <- as.matrix(golub[,-1])
### select learning set
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size=floor(ratio*length(golubY)))
### run untuned linear SVM
svmresult <- svmCMA(X=golubX, y=golubY, learnind=learnind, probability=TRUE)
### show results
show(svmresult)
ftable(svmresult)
plot(svmresult)
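A further sketch, not in the original examples: assuming that additional
arguments of svm from the package e1071, such as cost, are passed on unchanged
through '...', the fit above can be repeated with a different cost value.

### illustrative sketch: pass a different cost to the underlying svm via '...'
### (reuses golubX, golubY and learnind from the example above; 'cost' is an
### argument of e1071::svm, assumed here to be forwarded by svmCMA)
svmcost <- svmCMA(X=golubX, y=golubY, learnind=learnind, cost=10,
                  probability=TRUE)
ftable(svmcost)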