Kernel-Weighted Maximum Variance Projection (KMVP) is a generalization of
Maximum Variance Projection (MVP). Even though its name contains "kernel", it is
not related to the kernel trick well known in the machine learning community. Rather, it
generalizes the binary penalization on class discrepancy,
$$ S_{ij} = \exp\left(-\frac{\|x_i - x_j\|^2}{t}\right) \ \textrm{if } C_i \ne C_j, \qquad S_{ij} = 0 \ \textrm{otherwise}, $$
where $x_i$ is an $i$-th data point, $C_i$ its class label, and $t$ a kernel bandwidth (bandwidth). Note that
when the bandwidth value is too small, it might suffer from numerical instability and rank deficiency due to its formulation.
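To make the weighting concrete, below is a minimal base-R sketch of the penalty matrix construction; the helper build_kernel_weights and the toy data are illustrative assumptions and not functions from Rdimtools.

## sketch (not part of Rdimtools): build S_ij = exp(-||x_i - x_j||^2 / t)
## for pairs from different classes, and 0 for same-class pairs
build_kernel_weights <- function(X, label, bandwidth = 1) {
  D2 <- as.matrix(dist(X))^2                     # squared pairwise distances
  S  <- exp(-D2 / bandwidth)                     # heat-kernel weights
  diffclass <- outer(as.character(label), as.character(label), FUN = "!=")
  S * diffclass                                  # zero out same-class pairs
}

## toy usage
set.seed(1)
Xtoy <- matrix(rnorm(20), ncol = 2)              # 10 observations in 2D
ltoy <- rep(c("a", "b"), each = 5)
S    <- build_kernel_weights(Xtoy, ltoy, bandwidth = 1)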
do.kmvp(
X,
label,
ndim = 2,
preprocess = c("center", "scale", "cscale", "decorrelate", "whiten"),
bandwidth = 1
)
X: an (n x p) matrix or data frame whose rows are observations and columns represent independent variables.
label: a length-n vector of data class labels.
ndim: an integer-valued target dimension.
preprocess: an additional option for preprocessing the data.
Default is "center". See also aux.preprocess
for more details.
bandwidth: bandwidth parameter for the heat kernel, as in the equation above.
a named list containing
Y: an (n x ndim) matrix whose rows are embedded observations.
trfinfo: a list containing information for out-of-sample prediction.
projection: a (p x ndim) matrix whose columns are the basis for projection.
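As an illustration of how the returned basis can be used, the sketch below maps held-out points into the embedding by hand; it assumes preprocess = "center" (so only the training column means need to be removed) and is not the package's own prediction routine.

library(Rdimtools)
## assumption: model fitted with preprocess = "center" (the default)
data(iris)
X     = as.matrix(iris[,1:4])
label = as.factor(iris$Species)
out   = do.kmvp(X, label, ndim=2, bandwidth=1)
## manually project "new" points: center with training means, multiply by basis
Xnew  = X[1:5,]
Ynew  = sweep(Xnew, 2, colMeans(X)) %*% out$projection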
## load iris data
data(iris)
X = as.matrix(iris[,1:4])
label = as.factor(iris$Species)
## perform KMVP with different bandwidths
out1 = do.kmvp(X, label, bandwidth=0.1)
out2 = do.kmvp(X, label, bandwidth=1)
out3 = do.kmvp(X, label, bandwidth=10)
## visualize
opar <- par(no.readonly=TRUE)
par(mfrow=c(1,3))
plot(out1$Y, main="bandwidth=0.1", col=label)
plot(out2$Y, main="bandwidth=1", col=label)
plot(out3$Y, main="bandwidth=10", col=label)
par(opar)
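As a rough numerical follow-up (not part of the package), one way to compare the three embeddings is the ratio of mean between-class to mean within-class distance; sep_ratio below is a hypothetical helper.

## hypothetical helper: larger ratio = better-separated classes in the embedding
sep_ratio <- function(Y, label) {
  D    <- as.matrix(dist(Y))
  same <- outer(as.character(label), as.character(label), FUN = "==")
  diag(D) <- NA                                  # drop self-distances
  mean(D[!same], na.rm=TRUE) / mean(D[same], na.rm=TRUE)
}
sapply(list(out1$Y, out2$Y, out3$Y), sep_ratio, label=label)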