dbscan (version 1.1-12)

frNN: Find the Fixed Radius Nearest Neighbors

Description

This function uses a kd-tree to quickly find the fixed radius nearest neighbors (including distances).

Usage

frNN(
  x,
  eps,
  query = NULL,
  sort = TRUE,
  search = "kdtree",
  bucketSize = 10,
  splitRule = "suggest",
  approx = 0
)

# S3 method for frNN sort(x, decreasing = FALSE, ...)

# S3 method for frNN adjacencylist(x, ...)

# S3 method for frNN print(x, ...)

Value

frNN() returns an object of class frNN (subclass of NN) containing a list with the following components:

id

a list of integer vectors. Each vector contains the ids of the fixed radius nearest neighbors.

dist

a list with distances (same structure as id).

eps

neighborhood radius eps that was used.

adjacencylist() returns a list with one entry per data point in x. Each entry contains the ids of the nearest neighbors.
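
For illustration, a minimal sketch of how these components can be inspected, using the iris data as in the Examples below:

data(iris)
nn <- frNN(iris[, -5], eps = 0.5)

length(nn$id)       # one integer vector of neighbor ids per data point
nn$id[[1]]          # neighbors of the first point (within eps)
nn$dist[[1]]        # the corresponding distances (same order as id)
nn$eps              # the radius that was used
head(adjacencylist(nn), 2)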

Arguments

x

a data matrix, a dist object or a frNN object.

eps

neighborhood radius.

query

a data matrix with the points to query. If query is not specified, the nearest neighbors for all points in x are returned. If query is specified, then x needs to be a data matrix.

sort

sort the neighbors by distance? This is expensive and can be done later using sort().

search

nearest neighbor search strategy (one of "kdtree", "linear" or "dist").

bucketSize

maximum size of the kd-tree leaves.

splitRule

rule to split the kd-tree. One of "STD", "MIDPT", "FAIR", "SL_MIDPT", "SL_FAIR" or "SUGGEST" (SL stands for sliding). "SUGGEST" uses ANN's best guess.

approx

use approximate nearest neighbors. All NN up to a distance of a factor of (1 + approx) * eps may be used. Some actual NN may be omitted, leading to spurious clusters and noise points. However, the algorithm will enjoy a significant speedup (see the sketch after this argument list).

decreasing

sort in decreasing order?

...

further arguments
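
A quick sketch of the effect of approx (the exact counts and the speedup depend on the data):

data(iris)
x <- iris[, -5]

nn_exact  <- frNN(x, eps = 0.5)               # exact fixed-radius search
nn_approx <- frNN(x, eps = 0.5, approx = 0.5) # approximate search: neighbors up to
                                              # (1 + approx) * eps away may be used and
                                              # some true neighbors may be missed

# compare the number of neighbors found per point
summary(sapply(nn_exact$id, length) - sapply(nn_approx$id, length))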

Author

Michael Hahsler

Details

If x is specified as a data matrix, then Euclidean distances and fast nearest neighbor lookup using a kd-tree are used.

To create a frNN object from scratch, you need to supply at least the element id (a list of integer vectors with the nearest neighbor ids for each point) and the radius eps (see the Value section).
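
A minimal sketch of building such an object by hand (the ids, distances and radius below are made up for illustration):

nn <- structure(
  list(
    id   = list(c(2L, 3L), c(1L, 3L), c(1L, 2L)),  # neighbor ids for each of 3 points
    dist = list(c(.3, .4), c(.3, .2), c(.4, .2)),  # distances (same structure as id)
    eps  = 0.5                                     # the neighborhood radius
  ),
  class = c("frNN", "NN"))
nn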

Self-matches: Self-matches are not returned!

References

David M. Mount and Sunil Arya (2010). ANN: A Library for Approximate Nearest Neighbor Searching, http://www.cs.umd.edu/~mount/ANN/.

See Also

Other NN functions: NN, comps(), kNNdist(), kNN(), sNN()

Examples

data(iris)
x <- iris[, -5]

# Example 1: Find fixed radius nearest neighbors for each point
nn <- frNN(x, eps = .5)

# Number of neighbors
hist(sapply(adjacencylist(nn), length),
  xlab = "k", main="Number of Neighbors",
  sub = paste("Neighborhood size eps =", nn$eps))

# Explore neighbors of point i = 10
i <- 10
nn$id[[i]]
nn$dist[[i]]
plot(x, col = ifelse(1:nrow(iris) %in% nn$id[[i]], "red", "black"))

# get an adjacency list
head(adjacencylist(nn))

# plot the fixed radius neighbors (and then reduced to a radius of .3)
plot(nn, x)
plot(frNN(nn, eps = .3), x)

## Example 2: find fixed-radius NN for query points
q <- x[c(1,100),]
nn <- frNN(x, eps = .5, query = q)

plot(nn, x, col = "grey")
points(q, pch = 3, lwd = 2)
