Sunil Arya

5 packages on CRAN

dbscan

A fast reimplementation of several density-based algorithms of the DBSCAN family for spatial data. Includes the DBSCAN (density-based spatial clustering of applications with noise) and OPTICS (ordering points to identify the clustering structure) clustering algorithms, HDBSCAN (hierarchical DBSCAN), and the LOF (local outlier factor) algorithm. The implementations use the kd-tree data structure (from the ANN library) for faster k-nearest neighbor search. An R interface to fast kNN and fixed-radius NN search is also provided.
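
A minimal usage sketch in R, assuming the package's exported names dbscan(), hdbscan(), optics(), lof(), kNN(), and frNN(); the eps and minPts values below are arbitrary, for illustration only:

    library(dbscan)

    x <- as.matrix(iris[, 1:4])               # numeric data matrix

    db  <- dbscan(x, eps = 0.4, minPts = 5)   # DBSCAN; cluster 0 = noise
    hdb <- hdbscan(x, minPts = 5)             # HDBSCAN cluster hierarchy
    opt <- optics(x, minPts = 5)              # OPTICS reachability ordering
    out <- lof(x, minPts = 5)                 # LOF outlier scores

    nn <- kNN(x, k = 5)                       # fast kNN search (ANN kd-tree)
    fr <- frNN(x, eps = 0.5)                  # fixed-radius NN search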

FNN

Implements cover-tree and kd-tree fast k-nearest neighbor search algorithms and related applications, including KNN classification, regression, and information measures.
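
For example, a short sketch assuming FNN's exported names get.knn(), knn(), and knn.reg(); the train/test split is made up for illustration:

    library(FNN)

    set.seed(1)
    idx   <- sample(nrow(iris), 100)
    train <- as.matrix(iris[idx, 1:4])
    test  <- as.matrix(iris[-idx, 1:4])

    nn   <- get.knn(train, k = 5, algorithm = "kd_tree")  # $nn.index, $nn.dist
    pred <- knn(train, test, cl = iris$Species[idx], k = 5)          # classification
    reg  <- knn.reg(train, test, y = iris$Sepal.Length[idx], k = 5)  # regression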

RANN

Finds the k nearest neighbours for every point in a given dataset in O(N log N) time using Arya and Mount's ANN library (v1.1.3). There is support for approximate as well as exact searches, fixed radius searches and 'bd' as well as 'kd' trees. The distance is computed using the L2 (Euclidean) metric. Please see package 'RANN.L1' for the same functionality using the L1 (Manhattan, taxicab) metric.
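
A short sketch of the exported workhorse nn2(); parameter values below are illustrative:

    library(RANN)

    set.seed(1)
    pts <- matrix(runif(2000), ncol = 2)

    exact  <- nn2(pts, k = 5)                     # exact kNN on a kd-tree
    approx <- nn2(pts, k = 5, eps = 0.1)          # approximate search, error bound 0.1
    radius <- nn2(pts, k = 5, searchtype = "radius", radius = 0.05)  # fixed radius
    bdtree <- nn2(pts, k = 5, treetype = "bd")    # bd-tree instead of kd-tree

    str(exact$nn.idx)    # n x k neighbour indices
    str(exact$nn.dists)  # n x k L2 (Euclidean) distances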

RANN.L1

Finds the k nearest neighbours for every point in a given dataset in O(N log N) time using Arya and Mount's ANN library (v1.1.3). There is support for approximate as well as exact searches, fixed radius searches and 'bd' as well as 'kd' trees. The distance is computed using the L1 (Manhattan, taxicab) metric. Please see package 'RANN' for the same functionality using the L2 (Euclidean) metric.
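
The interface mirrors RANN, with only the metric changed; a minimal sketch, assuming the same exported nn2():

    library(RANN.L1)

    pts <- matrix(runif(2000), ncol = 2)
    nn  <- nn2(pts, k = 5)   # as in RANN, but nn$nn.dists are L1 distances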

Estimates the transfer entropy from one time series to another, where each time series consists of continuous random variables. Transfer entropy is an extension of mutual information that takes into account the direction of information flow, under the assumption that the underlying processes can be described by a Markov model. Two estimation methods are provided. The first calculates transfer entropy as a difference of mutual information terms, where mutual information is estimated using the Kraskov method, which builds on a nearest-neighbor framework (see the package references). The second estimates transfer entropy via a generalized correlation sum.
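
To illustrate the first method, here is a minimal, self-contained R sketch of a Kraskov-style (KSG) mutual information estimator and the difference-of-mutual-information form of transfer entropy. The helper names ksg_mutinfo() and transfer_entropy_xy() are hypothetical, not the package's API, and the O(n^2) distance computation makes this suitable only for short series:

    # KSG mutual information (Kraskov et al. 2004, algorithm 1); a sketch,
    # not the package's implementation.
    ksg_mutinfo <- function(X, Y, k = 5) {
      X <- as.matrix(X); Y <- as.matrix(Y)
      n <- nrow(X)
      cheb <- function(M) {             # pairwise Chebyshev (max-norm) distances
        d <- matrix(0, n, n)
        for (j in seq_len(ncol(M)))
          d <- pmax(d, abs(outer(M[, j], M[, j], "-")))
        d
      }
      dx <- cheb(X); dy <- cheb(Y)
      dz <- pmax(dx, dy)                # joint-space distance
      diag(dz) <- Inf                   # a point is not its own neighbor
      s <- 0
      for (i in seq_len(n)) {
        eps <- sort(dz[i, ])[k]         # distance to k-th neighbor in joint space
        nx  <- sum(dx[i, -i] < eps)     # marginal neighbor counts within eps
        ny  <- sum(dy[i, -i] < eps)
        s   <- s + digamma(nx + 1) + digamma(ny + 1)
      }
      digamma(k) + digamma(n) - s / n   # KSG estimate of I(X; Y)
    }

    # TE_{X->Y} for a first-order Markov model, written as a difference of
    # mutual informations: I((Y_t, Y_{t-1}); X_{t-1}) - I(Y_{t-1}; X_{t-1}).
    transfer_entropy_xy <- function(x, y, k = 5) {
      n   <- length(y)
      yt  <- y[2:n]
      yt1 <- y[1:(n - 1)]
      xt1 <- x[1:(n - 1)]
      ksg_mutinfo(cbind(yt, yt1), xt1, k) - ksg_mutinfo(yt1, xt1, k)
    }

    set.seed(42)
    x <- rnorm(500)
    y <- c(0, 0.8 * head(x, -1)) + rnorm(500, sd = 0.5)  # y driven by lagged x
    transfer_entropy_xy(x, y)   # clearly positive: information flows x -> y
    transfer_entropy_xy(y, x)   # near zero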