CMA (version 1.30.0)

pnnCMA: Probabilistic Neural Networks

Description

Probabilistic Neural Networks is the term used by Specht (1990) for a Gaussian kernel estimator of the conditional class densities. For S4 method information, see pnnCMA-methods.
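To make this concrete, here is a minimal, package-independent sketch of the PNN decision rule (illustrative only, not CMA's implementation; the function pnn_predict and its arguments are made up for this example): each class score is the average Gaussian kernel between a new observation and the learning-set observations of that class, and the class with the largest score is predicted.

### illustrative sketch only -- not the pnnCMA() implementation
pnn_predict <- function(Xlearn, ylearn, xnew, sigma = 1) {
  classes <- sort(unique(ylearn))
  scores <- sapply(classes, function(k) {
    Xk <- Xlearn[ylearn == k, , drop = FALSE]
    d2 <- rowSums(sweep(Xk, 2, xnew)^2)   ### squared distances to class-k points
    mean(exp(-d2/(2*sigma^2)))            ### Gaussian kernel estimate of the class density
  })
  classes[which.max(scores)]
}

Small values of sigma make the estimate spiky and dominated by the nearest learning-set points, while large values smooth the class densities towards each other, which is why sigma needs tuning.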

Usage

pnnCMA(X, y, f, learnind, sigma = 1, models = FALSE)

Arguments

X
Gene expression data. Can be one of the following:
  • A matrix. Rows correspond to observations, columns to variables.
  • A data.frame, when f is not missing (see below).
  • An object of class ExpressionSet. Each variable (gene) will be scaled for unit variance and zero mean.
y
Class labels. Can be one of the following:
  • A numeric vector.
  • A factor.
  • A character if X is an ExpressionSet that specifies the phenotype variable.
  • missing, if X is a data.frame and a proper formula f is provided.
WARNING: The class labels will be re-coded to range from 0 to K-1, where K is the total number of different classes in the learning set.
f
A two-sided formula, if X is a data.frame. The left part corresponds to the class labels, the right part to the variables.
learnind
An index vector specifying the observations that belong to the learning set. For this method, learnind must not be missing.
sigma
Standard deviation of the Gaussian kernel used. This hyperparameter should be tuned; see tune and the tuning sketch after this argument list. The default is 1, which generally does not lead to good results: the method reacts very sensitively to the value of sigma. Take care if warnings related to the particular choice appear.
models
A logical value indicating whether the fitted model object should be returned.
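Since results depend strongly on sigma, the following sketch illustrates tuning it with CMA's tune, using Monte-Carlo cross-validation learning sets. It is only a sketch under the assumption that GenerateLearningsets and tune accept the arguments shown (method, niter, ntrain, learningsets, classifier, grids); the sigma grid is arbitrary.

### assumed interface of GenerateLearningsets()/tune(); check their help pages
data(golub)
golubY <- golub[,1]
golubX <- as.matrix(golub[,2:11])
set.seed(111)
### Monte-Carlo cross-validation learning sets
ls <- GenerateLearningsets(y = golubY, method = "MCCV", niter = 5,
                           ntrain = floor(2/3*length(golubY)))
### tune sigma over a small grid
tuneres <- tune(X = golubX, y = golubY, learningsets = ls,
                classifier = pnnCMA, grids = list(sigma = c(0.5, 1, 2, 4)))
show(tuneres)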

Value

References

Specht, D.F. (1990). Probabilistic Neural Networks. Neural Networks, 3, 109-118.

See Also

compBoostCMA, dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA, knnCMA, ldaCMA, LassoCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA, svmCMA

Examples

### load Golub AML/ALL data
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression from first 10 genes
golubX <- as.matrix(golub[,2:11])
### select learningset
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size=floor(ratio*length(golubY)))
### run PNN
pnnresult <- pnnCMA(X=golubX, y=golubY, learnind=learnind, sigma = 3)
### show results
show(pnnresult)
ftable(pnnresult)
plot(pnnresult)
