Computes the cumulative Jensen-Shannon distance between two samples.
cjs_dist(
  x,
  y,
  x_weights = NULL,
  y_weights = NULL,
  metric = TRUE,
  unsigned = TRUE,
  ...
)
A numeric scalar: the distance value based on the CJS computation.
x: numeric vector of draws from the first distribution
y: numeric vector of draws from the second distribution
x_weights: numeric vector (same length as x) of weights for the draws of the first distribution
y_weights: numeric vector (same length as y) of weights for the draws of the second distribution
metric: logical; if TRUE, return the square root of CJS. Default is TRUE.
unsigned: logical; if TRUE, return the maximum of CJS(P(x) || Q(x)) and CJS(P(-x) || Q(-x)), which ensures invariance to transformations such as PCA. Default is TRUE.
...: unused
The Cumulative Jensen-Shannon distance is a symmetric metric based on the cumulative Jensen-Shannon divergence. The divergence CJS(P || Q) between two cumulative distribution functions P and Q is defined as:
$$CJS(P || Q) = \sum P(x) \log_2 \frac{P(x)}{0.5 (P(x) + Q(x))} + \frac{1}{2 \ln 2} \sum (Q(x) - P(x))$$
The symmetric metric is defined as:
$$CJS_{dist}(P || Q) = \sqrt{CJS(P || Q) + CJS(Q || P)}$$
This distance has an upper bound of \(\sqrt{\sum (P(x) + Q(x))}\).
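For illustration, the divergence can be computed from the empirical CDFs of the two samples evaluated on their pooled support. The sketch below is an illustration of the formula above, not the package's internal implementation; the helper names ecdf_cjs and cjs_dist_sketch are hypothetical, equal weights are assumed, and the base-2 logarithm follows Nguyen and Vreeken (2015):

# Sketch: empirical CJS divergence and symmetrised distance (illustrative only)
ecdf_cjs <- function(x, y) {
  grid <- sort(unique(c(x, y)))  # pooled support of both samples
  P <- ecdf(x)(grid)             # empirical CDF of x on the grid
  Q <- ecdf(y)(grid)             # empirical CDF of y on the grid
  M <- 0.5 * (P + Q)
  # Convention: terms with P(x) = 0 contribute 0 (0 * log 0 := 0)
  sum(ifelse(P > 0, P * log2(P / M), 0)) + sum(Q - P) / (2 * log(2))
}

cjs_dist_sketch <- function(x, y) {
  sqrt(ecdf_cjs(x, y) + ecdf_cjs(y, x))  # CJS_dist as defined above
}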
Nguyen, H.-V. and Vreeken, J. (2015). Non-parametric Jensen-Shannon divergence. In: Appice A., Rodrigues P., Santos Costa V., Gama J., Jorge A., Soares C. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2015. Lecture Notes in Computer Science, vol 9285. Springer, Cham. doi:10.1007/978-3-319-23525-7_11
# Draws from two normal distributions with different means and scales
x <- rnorm(100)
y <- rnorm(100, 2, 2)
cjs_dist(x, y, x_weights = NULL, y_weights = NULL)
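The remaining arguments can be exercised in the same way. The sketch below assumes, following the argument descriptions above, that uniform weights reproduce the unweighted call and that unsigned = TRUE (the default) makes the result invariant to a joint sign flip; the exact numerical behaviour is an assumption, not a documented guarantee:

# Hypothetical usage sketch, reusing x and y from above
w <- rep(1 / 100, 100)
cjs_dist(x, y, x_weights = w, y_weights = w)  # uniform weight per draw
cjs_dist(-x, -y)                # should match cjs_dist(x, y) when unsigned = TRUE
cjs_dist(x, y, metric = FALSE)  # symmetrised CJS without the square root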