
classif.DD: DD-Classifier Based on DD-plot

Usage

classif.DD(group, fdataobj, depth = "FM", classif = "glm", w,
           par.classif = list(), par.depth = list(),
           control = list(verbose = FALSE, draw = TRUE, col = NULL, alpha = 0.25))
Arguments

group: Factor of length n with g levels.

fdataobj: data.frame, fdata or list with the multivariate, functional or both covariates respectively.

depth: Type of depth function, see Details.

classif: Type of classifier method, see Details.

w: Optional weights, one for each value of the depth argument, see Details.

par.classif: List of parameters for the classif procedure.

par.depth: List of parameters for the depth function.

control: List of parameters for controlling the process. If verbose=TRUE, report extra information on progress. If draw=TRUE, print the DD-plot of the samples based on data depth. col, the colors for points in the DD-plot w.r.t. each group level. alpha, the alpha transparency of the points.

Details

Make the group classification of a training dataset using DD-classifier estimation, a classification method that uses the data depth as covariate, in the following steps.

1. The function computes the depth measure of the points in fdataobj w.r.t. a subsample of each of the g group levels and each of the p data dimensions ($G = g \times p$). The user can specify the parameters for the depth function in par.depth.
(i) Type of depth function for functional data, see Depth:

"FM": Fraiman and Muniz depth.
"mode": h-modal depth.
"RT": random Tukey depth.
"RP": random projection depth.
"RPD": double random projection depth.

(ii) Type of depth function for multivariate functional data, see depth.pfdata:

"FMp": Fraiman and Muniz depth with common support; it assumes that all p fdata objects have the same support (the same rangevals), see depth.FMp.
"modep": h-modal depth using a p-dimensional metric, see depth.modep.
"RPp": random projection depth using a p-variate depth with the projections, see depth.RPp.
."knn"
or"np"
classifier or"mode"
depth, the user must use a proper distance function:metric.lp
for functional data andmetric.dist
for multivariate data.
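A minimal sketch of supplying a distance through par.depth (this assumes depth.mode accepts a metric argument, as its documentation describes; the tecator objects also appear in the Examples below):

library(fda.usc)
data(tecator)
ab <- tecator$absorp.fdata
gfat <- factor(as.numeric(tecator$y$Fat >= 15))
# "mode" depth needs a distance: metric.lp for functional data
# (metric.dist would be the choice for multivariate data)
out.mode <- classif.DD(gfat, ab, depth = "mode", classif = "np",
                       par.depth = list(metric = metric.lp))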
(iii) Type of depth function for multivariate data, see Depth.Multivariate:

"SD": Simplicial depth (for bivariate data).
"HS": Half-space depth.
"MhD": Mahalanobis depth.
"RD": Random projections depth.
"LD": Likelihood depth.

2. The function fits the chosen classifier on the DD-plot, using the computed depths as covariates. Type of classifier:

"MaxD": Maximum depth.
"DD1": Search the best separating polynomial of degree 1.
"DD2": Search the best separating polynomial of degree 2.
"DD3": Search the best separating polynomial of degree 3.
"glm": Logistic regression computed using Generalized Linear Models, see classif.glm.
"gam": Logistic regression computed using Generalized Additive Models, see classif.gsam.
"lda": Linear Discriminant Analysis, computed using lda.
"qda": Quadratic Discriminant Analysis, computed using qda.
"knn": k-Nearest Neighbour classification, computed using classif.knn.
"np": Non-parametric kernel classifier, computed using classif.np.

The user can specify the parameters for the classifier in par.classif, such as the smoothing parameter par.classif[["h"]] if classif="np", or the number of neighbours par.classif[["knn"]] if classif="knn"; see the sketch below.
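For instance, continuing the tecator sketch above (k = 5 and h = 0.1 are arbitrary illustrative values, not defaults):

# k-nearest-neighbour classifier on the DD-plot, k supplied by the user
out.knn <- classif.DD(gfat, ab, depth = "FM", classif = "knn",
                      par.classif = list(knn = 5))
# kernel classifier with a user-chosen smoothing parameter h
out.np <- classif.DD(gfat, ab, depth = "FM", classif = "np",
                     par.classif = list(h = 0.1))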
In the case of the polynomial classifiers ("DD1", "DD2" and "DD3"), the function uses the original procedure proposed by Li et al. (2012), by default rotating the DD-plot (exchanging abscissa and ordinate) via the par.classif argument rotate=TRUE. Notice that the maximum depth classifier can be considered a particular case of DD1, obtained by fixing the slope to 1 (par.classif=list(pol=1)).
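Continuing the sketch, the two calls below should yield the same rule, per the equivalence just noted:

# maximum depth classifier
out.maxd <- classif.DD(gfat, ab, depth = "FM", classif = "MaxD")
# DD1 with the polynomial slope fixed to 1 (maximum depth as a DD1 special case)
out.dd1 <- classif.DD(gfat, ab, depth = "FM", classif = "DD1",
                      par.classif = list(pol = 1))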
The number of possible different polynomials depends on the sample size n and increases polynomially with the order $k$. With $g$ groups this search becomes expensive, so the procedure applies a multiple-start optimization scheme to save time: when the number of candidate polynomials exceeds nmax=10000, a random sample of 10000 combinations is drawn, and the classification rule is constructed by optimizing the best noptim combinations in this random sample (by default noptim=1 and tt=50/range(depth values)). Note that Li et al. found that the optimization results become stable for $t \in [50, 200]$ when the depth is standardized with upper bound 1.

The original procedure of Li et al. (2012) considers it enough to try a moderate number of initial polynomials (nmax=1000) and to optimize only the best one (noptim=1), but we recommend repeating the last step for several different solutions, for example nmax=250 and noptim=25; see the sketch below. The user can change the parameters pol, rotate, nmax, noptim and tt in the par.classif argument.
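A sketch of the recommended multi-start tuning, again on the tecator objects defined above:

# DD2: sample 250 candidate polynomials and optimize the best 25 of them
out.dd2 <- classif.DD(gfat, ab, depth = "FM", classif = "DD2",
                      par.classif = list(rotate = TRUE, nmax = 250, noptim = 25))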
The classif.DD procedure extends to multi-class problems by incorporating the method of majority voting in the case of the polynomial classifiers and the method One vs the Rest in the logistic case ("glm" and "gam").

See Also

predict.classif.DD

Examples
library(fda.usc)

# DD-classif for functional data
data(tecator)
ab=tecator$absorp.fdata
ab1=fdata.deriv(ab,nderiv=1)
ab2=fdata.deriv(ab,nderiv=2)
gfat=factor(as.numeric(tecator$y$Fat>=15))
# DD-classif for p=1 functional data set
out01=classif.DD(gfat,ab,depth="mode",classif="np")
out02=classif.DD(gfat,ab2,depth="mode",classif="np")
# DD-plot in gray scale
ctrl<-list(draw=T,col=gray(c(0,.5)),alpha=.2)
out02bis=classif.DD(gfat,ab2,depth="mode",classif="np",control=ctrl)
# 2 depth functions (same curves)
out03=classif.DD(gfat,list(ab2,ab2),depth=c("RP","mode"),classif="np")
# DD-classif for p=2 functional data set
ldata<-list("ab"=ab2,"ab2"=ab2)
# Weighted version
out04=classif.DD(gfat,ldata,depth="mode",classif="np",w=c(0.5,0.5))
# Model version
out05=classif.DD(gfat,ldata,depth="mode",classif="np")
# Integrated version (for multivariate functional data)
out06=classif.DD(gfat,ldata,depth="modep",classif="np")
# DD-classif for multivariate data
data(iris)
group<-iris[,5]
x<-iris[,1:4]
out10=classif.DD(group,x,depth="RP",classif="lda")
summary.classif(out10)
out11=classif.DD(group,list(x,x),depth=c("MhD","RP"),classif="lda")
summary.classif(out11)
# DD-classif for functional data: g levels
data(phoneme)
mlearn<-phoneme[["learn"]]
glearn<-as.numeric(phoneme[["classlearn"]])-1
out20=classif.DD(glearn,mlearn,depth="FM",classif="glm")
out21=classif.DD(glearn,list(mlearn,mlearn),depth=c("FM","RP"),classif="glm")
summary.classif(out20)
summary.classif(out21)
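New observations can be classified with the fitted objects via predict.classif.DD; a minimal sketch on the phoneme test curves, assuming the standard new-data predict interface:

# classify the phoneme test curves with the model fitted on the learn set
mtest<-phoneme[["test"]]
gtest<-as.numeric(phoneme[["classtest"]])-1
pred20<-predict(out20,mtest)
table(pred20,gtest)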