Performs hierarchical clustering through dist and hclust. So far it is mainly
a wrapper around these two functions, plus plotting that uses the facilities
of the dendextend package.
Usage

CLUST(x, ...)

# S3 method for default
CLUST(x, ...)

# S3 method for Coe
CLUST(
  x,
  fac,
  type = c("horizontal", "vertical", "fan")[1],
  k,
  dist_method = "euclidean",
  hclust_method = "complete",
  retain = 0.99,
  labels,
  lwd = 1/4,
  cex = 1/2,
  palette = pal_qual,
  ...
)
Value

A ggplot plot.

Arguments

x              a Coe or PCA object

...            useless here

fac            factor specification, to feed fac_dispatcher

type           character, one of c("horizontal", "vertical", "fan")
               (default: "horizontal")

k              numeric; if provided and greater than 1, cut the tree into
               this number of groups

dist_method    to feed dist's method argument, that is one of "euclidean"
               (default), "maximum", "manhattan", "canberra", "binary" or
               "minkowski"

hclust_method  to feed hclust's method argument, one of "ward.D", "ward.D2",
               "single", "complete" (default), "average", "mcquitty",
               "median" or "centroid"

retain         number of axes to retain if a PCA object is passed. If a
               number < 1 is passed, the number of PCs retained will be
               enough to capture this proportion of variance, via scree_min

labels         factor specification for labelling tips, to feed
               fac_dispatcher

lwd            linewidth for branches (default: 0.25)

cex            size for labels (default: 1)

palette        one of the available palettes
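Since CLUST() is described above as essentially a wrapper around dist() and hclust(), its core, minus the dendextend-based plotting, can be sketched in base R. This is a hypothetical illustration, not Momocs code; the data matrix is made up:

```r
# Base-R sketch of what CLUST() wraps: dist() then hclust(), using the
# same method defaults, with cutree() standing in for the k argument.
set.seed(1)
x <- matrix(rnorm(40), nrow = 10)       # 10 observations, 4 variables (toy data)
d <- dist(x, method = "euclidean")      # dist_method = "euclidean" (default)
hc <- hclust(d, method = "complete")    # hclust_method = "complete" (default)
groups <- cutree(hc, k = 3)             # k = 3: cut the tree into 3 groups
```

Any of the dist_method / hclust_method values listed above can be swapped into the method arguments of dist() and hclust() respectively.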
See Also

Other multivariate: KMEANS(), KMEDOIDS(), LDA(), MANOVA_PW(), MANOVA(),
MDS(), MSHAPES(), NMDS(), PCA(), classification_metrics()
Examples

# On Coe
bf <- bot %>% efourier(6)
CLUST(bf)

# with a factor and vertical
CLUST(bf, ~type, "v")

# with some cutting and different dist/hclust methods
CLUST(bf,
      dist_method = "maximum", hclust_method = "average",
      labels = ~type, k = 3, lwd = 1, cex = 1,
      palette = pal_manual(c("green", "yellow", "red")))

# On PCA
bf %>% PCA %>% CLUST
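When a PCA object is passed with retain < 1, enough PCs are kept to capture that proportion of variance (via Momocs' scree_min). The rule can be mimicked in base R; prcomp() on USArrests stands in here for a Momocs PCA object, so this is only an analogy, not the package's own code path:

```r
# Base-R sketch of retain < 1: keep the smallest number of principal
# components whose cumulative share of variance reaches the threshold.
pca <- prcomp(USArrests, scale. = TRUE)
prop <- cumsum(pca$sdev^2) / sum(pca$sdev^2)  # cumulative variance share
n_retained <- which(prop >= 0.99)[1]          # smallest PC count reaching 0.99
```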