
GMKMcharlie (version 1.0.3)

Unsupervised Gaussian Mixture and Minkowski K-Means

Description

High-performance trainers for parameterizing and clustering weighted data. The Gaussian mixture (GM) module includes the conventional EM (expectation-maximization) trainer, the component-wise EM trainer, and the minimum-message-length EM trainer by Figueiredo and Jain (2002). These trainers accept additional constraints on mixture weights and covariance eigenvalue ratios. The K-means (KM) module offers clustering with the options of (i) deterministic and stochastic K-means++ initializations, (ii) upper bounds on cluster weights (sizes), (iii) Minkowski distances, (iv) cosine dissimilarity, and (v) dense and sparse representations of data input. The package improves on the usual implementations of GM and KM training algorithms in various aspects. It is carefully crafted in multithreaded C++ for processing large data in industrial use.
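As a hedged sketch of the dense K-means workflow described above: the function names KMppIni and KM come from this package's function list, but the argument names used below (centroid, minkP, stochastic) and the observations-in-columns data layout are assumptions; consult ?KMppIni and ?KM for the authoritative signatures.

```r
# Sketch only: argument names and data layout are assumptions; see ?KMppIni and ?KM.
library(GMKMcharlie)

set.seed(42)
d <- 2L; N <- 300L
# Two well-separated Gaussian blobs; observations assumed to sit in COLUMNS (d x N).
X <- cbind(matrix(rnorm(d * 150, mean = 0), nrow = d),
           matrix(rnorm(d * 150, mean = 4), nrow = d))

K <- 2L
# Deterministic K-means++ seeding over the dense input
# (Minkowski power 2, i.e. Euclidean distance).
seedIdx <- KMppIni(X, K, minkP = 2, stochastic = FALSE)

# K-means iterations starting from the seeded centroids.
fit <- KM(X, centroid = X[, seedIdx, drop = FALSE], minkP = 2)
```

Setting stochastic = TRUE would instead sample each next seed with probability proportional to its distance contribution, the classical randomized K-means++ behavior.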


Install

install.packages('GMKMcharlie')


Version

1.0.3

License

GPL-3

Maintainer

Charlie Wusuo Liu

Last Published

October 8th, 2019

Functions in GMKMcharlie (1.0.3)

GM: Multithreaded Gaussian mixture trainer

KMsparse: K-means over sparse representation of data

KMconstrained: K-means over dense data input with constraints on cluster weights

KMppIniSparse: Minkowski and spherical, deterministic and stochastic, multithreaded K-means++ initialization over sparse representation of data

GMcw: Multithreaded component-wise Gaussian mixture trainer

KMppIni: Minkowski and spherical, deterministic and stochastic, multithreaded K-means++ initialization over dense representation of data

d2s: Dense to sparse conversion

KMconstrainedSparse: K-means over sparse data input with constraints on cluster weights

GMfj: Multithreaded minimum message length Gaussian mixture trainer

s2d: Sparse to dense conversion

KM: K-means over dense representation of data
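The d2s and s2d helpers convert between the dense and sparse input representations that the sparse K-means functions (KMsparse, KMconstrainedSparse, KMppIniSparse) consume. A minimal sketch of the round trip, assuming d2s takes the dense matrix directly and s2d needs the original dimensionality back (argument names beyond the data matrix are assumptions; see ?d2s and ?s2d):

```r
# Sketch only: argument names are assumptions; see ?d2s and ?s2d.
library(GMKMcharlie)

set.seed(1)
# A mostly-zero dense matrix; observations assumed to sit in columns.
X <- matrix(0, nrow = 100, ncol = 10)
X[sample(length(X), 50)] <- runif(50)

# Dense -> sparse representation accepted by the KM*Sparse functions.
Xsparse <- d2s(X)

# Sparse -> dense round trip; the original dimensionality must be supplied.
Xdense <- s2d(Xsparse, d = 100)
```

For high-dimensional data with few nonzero entries per observation, working in the sparse representation avoids both the memory cost of the zeros and the distance-computation cost over them.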