# kpca


##### Kernel Principal Components Analysis

Kernel Principal Components Analysis is a nonlinear form of principal component analysis.

##### Keywords

cluster
##### Usage

## S3 method for class 'formula':
kpca(x, data = NULL, na.action, ...)

## S3 method for class 'matrix':
kpca(x, kernel = "rbfdot", kpar = list(sigma = 0.1), features = 0,
     th = 1e-4, na.action = na.omit, ...)

## S3 method for class 'kernelMatrix':
kpca(x, features = 0, th = 1e-4, ...)

## S3 method for class 'list':
kpca(x, kernel = "stringdot", kpar = list(length = 4, lambda = 0.5),
     features = 0, th = 1e-4, na.action = na.omit, ...)
##### Arguments
x
the data matrix indexed by row or a formula describing the model, or a kernel Matrix of class kernelMatrix, or a list of character vectors
data
an optional data frame containing the variables in the model (when using a formula).
kernel
the kernel function used in training and predicting. This parameter can be set to any function, of class kernel, which computes a dot product between two vector arguments. kernlab provides the most popular kernel functions, which can be selected by name, for example "rbfdot" (Radial Basis kernel), "polydot" (polynomial kernel), "vanilladot" (linear kernel), "tanhdot" (hyperbolic tangent kernel), "laplacedot" (Laplacian kernel), or "stringdot" (string kernel).
kpar
the list of hyper-parameters (kernel parameters). This is a list which contains the parameters to be used with the kernel function. Valid parameters for existing kernels include:
• sigma : inverse kernel width for the Radial Basis kernel "rbfdot" and the Laplacian kernel "laplacedot"
features
Number of features (principal components) to return. (default: 0, i.e. all components above the threshold th)
th
the value of the eigenvalue under which principal components are ignored (only valid when features = 0). (default : 0.0001)
na.action
A function to specify the action to be taken if NAs are found. The default action is na.omit, which leads to rejection of cases with missing values on any required variable. An alternative is na.fail, which signals an error if missing values are found.
...
additional parameters
##### Details

Using kernel functions one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some non-linear map. The data can be passed to the kpca function in a matrix or a data.frame, in addition kpca also supports input in the form of a kernel matrix of class kernelMatrix or as a list of character vectors where a string kernel has to be used.
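As a sketch of the two most common input forms described above, the same decomposition can be computed either from a data matrix or from a precomputed kernel matrix of class kernelMatrix (the sigma value 0.1 used here is arbitrary):

```r
library(kernlab)

data(iris)
x <- as.matrix(iris[, -5])

# Standard call on a data matrix with an RBF kernel
kp1 <- kpca(x, kernel = "rbfdot", kpar = list(sigma = 0.1), features = 2)

# Equivalent call on a precomputed kernelMatrix: build the Gram matrix
# first, then pass it directly (no kernel/kpar arguments are needed)
K <- kernelMatrix(rbfdot(sigma = 0.1), x)
kp2 <- kpca(K, features = 2)

# Both objects hold a 150 x 2 projection of the data
dim(rotated(kp1))
dim(rotated(kp2))
```

The kernelMatrix form is useful when the Gram matrix is expensive to compute and is reused across several analyses.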

##### Value

An S4 object containing the principal component vectors along with the corresponding eigenvalues. Its slots include:
• pcv : a matrix containing the principal component vectors (column wise)
• eig : the corresponding eigenvalues
• rotated : the original data projected (rotated) on the principal components
• xmatrix : the original data matrix
All the slots of the object can be accessed by accessor functions.
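As a brief sketch, the accessor functions for these slots can be used as follows on a fitted object:

```r
library(kernlab)

data(iris)
kp <- kpca(~ ., data = iris[, -5], kernel = "rbfdot",
           kpar = list(sigma = 0.2), features = 2)

head(pcv(kp))      # principal component vectors, one column per component
eig(kp)            # the corresponding eigenvalues
head(rotated(kp))  # the training data projected onto the components
dim(xmatrix(kp))   # the original data matrix
```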

##### Note

The predict function can be used to embed new data points into the space spanned by the principal components.

##### References

Schoelkopf B., A. Smola and K.-R. Mueller (1998), "Nonlinear component analysis as a kernel eigenvalue problem", Neural Computation 10, 1299-1319. http://mlg.anu.edu.au/~smola/papers/SchSmoMul98.pdf

##### See Also

kcca, pca

##### Aliases
• kpca
• kpca,formula-method
• kpca,matrix-method
• kpca,kernelMatrix-method
• kpca,list-method
• predict,kpca-method
##### Examples
# an example using the iris data set
data(iris)
test <- sample(1:150,20)

kpc <- kpca(~.,data=iris[-test,-5],kernel="rbfdot",kpar=list(sigma=0.2),features=2)

#print the principal component vectors
pcv(kpc)

#plot the data projection on the components
plot(rotated(kpc), col = as.integer(iris[-test, 5]),
     xlab = "1st Principal Component", ylab = "2nd Principal Component")

#embed remaining points
emb <- predict(kpc,iris[test,-5])
points(emb,col=as.integer(iris[test,5]))
Documentation reproduced from package kernlab, version 0.9-13, License: GPL-2
