mclust (version 5.2.2)

plot.MclustDA: Plotting method for MclustDA discriminant analysis

Description

Graphical tools for displaying training and test data, the known classification of the training data, the MclustDA classification of test data, and/or classification errors.

Usage

## S3 method for class 'MclustDA'
plot(x, what = c("scatterplot", "classification", "train&test", "error"),
     newdata, newclass, dimens, symbols, colors, ...)

Arguments

x
An object of class 'MclustDA' resulting from a call to MclustDA.
what
The type of graph requested:
"scatterplot" = a scatterplot of the training data, with points marked according to the known classification (the default);
"classification" = a plot of the training data, or of the test data if newdata is provided, with points marked according to the MclustDA classification;
"train&test" = a plot of the training and test data together, with points marked according to the set each observation belongs to (requires newdata);
"error" = a plot of the training data, or of the test data if newdata and newclass are provided, with misclassified points marked.

newdata
A data frame or matrix for test data.
newclass
A vector giving the class labels for the observations in the test data (if known).
dimens
A vector of integers giving the dimensions of the desired coordinate projections for multivariate data. The default is to use all of the available dimensions for plotting (see the sketch after this argument list for an example).
symbols
Either an integer or character vector assigning a plotting symbol to each unique class. Elements in symbols correspond to classes in order of appearance in the sequence of observations (the order used by the function factor). The default is given by mclust.options("classPlotSymbols").
colors
Either an integer or character vector assigning a color to each unique class. Elements in colors correspond to classes in order of appearance in the sequence of observations (the order used by the function factor). The default is given by mclust.options("classPlotColors").
...
further arguments passed to or from other methods.
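
A brief sketch of how dimens, symbols and colors fit together is given below (assuming the mclust package is loaded; the object name fit and the particular symbols and colors are purely illustrative, following the iris example in the Examples section):

library(mclust)
fit <- MclustDA(iris[, -5], iris[, 5], modelType = "EDDA")
# scatterplot restricted to the petal measurements (dimensions 3 and 4),
# with one plotting symbol and one color per class
plot(fit, what = "scatterplot", dimens = 3:4,
     symbols = c(1, 2, 3), colors = c("dodgerblue", "red3", "green4"))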

References

C. Fraley and A. E. Raftery (2002). Model-based clustering, discriminant analysis, and density estimation. Journal of the American Statistical Association 97:611-631.

C. Fraley, A. E. Raftery, T. B. Murphy and L. Scrucca (2012). mclust Version 4 for R: Normal Mixture Modeling for Model-Based Clustering, Classification, and Density Estimation. Technical Report No. 597, Department of Statistics, University of Washington.

Details

For more flexibility in plotting, use mclust1Dplot, mclust2Dplot, surfacePlot, coordProj, or randProj.
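
For instance, a rough sketch of plotting a single coordinate projection of the training data directly with coordProj (assuming the X.train and Class.train objects defined in the Examples below; only a classification is supplied here, so covariance ellipses are not drawn, and the exact argument handling may differ slightly across mclust versions):

coordProj(data = X.train, dimens = c(3, 4), what = "classification",
          classification = Class.train)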

See Also

MclustDA, surfacePlot, coordProj, randProj

Examples

## Not run: 
library(mclust)

odd <- seq(from = 1, to = nrow(iris), by = 2)
even <- odd + 1
X.train <- iris[odd, -5]
Class.train <- iris[odd, 5]
X.test <- iris[even, -5]
Class.test <- iris[even, 5]

# common EEE covariance structure (essentially equivalent to linear discriminant analysis)
irisMclustDA <- MclustDA(X.train, Class.train, modelType = "EDDA", modelNames = "EEE")
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

# common covariance structure selected by BIC
irisMclustDA <- MclustDA(X.train, Class.train, modelType = "EDDA")
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

# general covariance structure selected by BIC
irisMclustDA <- MclustDA(X.train, Class.train)
summary(irisMclustDA, parameters = TRUE)
summary(irisMclustDA, newdata = X.test, newclass = Class.test)

plot(irisMclustDA)
plot(irisMclustDA, dimens = 3:4)
plot(irisMclustDA, dimens = 4)

plot(irisMclustDA, what = "classification")
plot(irisMclustDA, what = "classification", newdata = X.test)
plot(irisMclustDA, what = "classification", dimens = 3:4)
plot(irisMclustDA, what = "classification", newdata = X.test, dimens = 3:4)
plot(irisMclustDA, what = "classification", dimens = 4)
plot(irisMclustDA, what = "classification", dimens = 4, newdata = X.test)

plot(irisMclustDA, what = "train&test", newdata = X.test)
plot(irisMclustDA, what = "train&test", newdata = X.test, dimens = 3:4)
plot(irisMclustDA, what = "train&test", newdata = X.test, dimens = 4)

plot(irisMclustDA, what = "error")
plot(irisMclustDA, what = "error", dimens = 3:4)
plot(irisMclustDA, what = "error", dimens = 4)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test, dimens = 3:4)
plot(irisMclustDA, what = "error", newdata = X.test, newclass = Class.test, dimens = 4)

# simulated 1D data
n <- 250
set.seed(1)
triModal <- c(rnorm(n, -5), rnorm(n, 0), rnorm(n, 5))
triClass <- c(rep(1, n), rep(2, n), rep(3, n))
odd <- seq(from = 1, to = length(triModal), by = 2)
even <- odd + 1
triMclustDA <- MclustDA(triModal[odd], triClass[odd])
summary(triMclustDA, parameters = TRUE)
summary(triMclustDA, newdata = triModal[even], newclass = triClass[even])
plot(triMclustDA)
plot(triMclustDA, what = "classification")
plot(triMclustDA, what = "classification", newdata = triModal[even])
plot(triMclustDA, what = "train&test", newdata = triModal[even])
plot(triMclustDA, what = "error")
plot(triMclustDA, what = "error", newdata = triModal[even], newclass = triClass[even])

# simulated 2D cross data
data(cross)
odd <- seq(from = 1, to = nrow(cross), by = 2)
even <- odd + 1
crossMclustDA <- MclustDA(cross[odd, -1], cross[odd, 1])
summary(crossMclustDA, parameters = TRUE)
summary(crossMclustDA, newdata = cross[even, -1], newclass = cross[even, 1])
plot(crossMclustDA)
plot(crossMclustDA, what = "classification")
plot(crossMclustDA, what = "classification", newdata = cross[even, -1])
plot(crossMclustDA, what = "train&test", newdata = cross[even, -1])
plot(crossMclustDA, what = "error")
plot(crossMclustDA, what = "error", newdata = cross[even, -1], newclass = cross[even, 1])
## End(Not run)
