This function implements a real-time version of principal sufficient dimension reduction (SDR) based on the least squares SVM (LS-SVM) loss. It is intended for streaming or sequential data settings where new observations arrive continuously and refitting the full SDR model would be computationally expensive.
After an initial psdr or rtpsdr fit is obtained, this function updates the working matrix M, slice statistics, and eigen-decomposition efficiently using only the new batch of data. The method supports both regression and binary classification, automatically choosing the appropriate LS-SVM variant.
The returned object includes cumulative sample size, updated mean vector, slice coefficients, intermediate matrices required for updates, and the resulting central subspace basis.
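The key point of the real-time update is that cumulative statistics can be refreshed from batch summaries alone, without revisiting earlier data. A minimal sketch of this idea for the running mean (illustrative only, not the package internals; the batch sizes and loop are made up for the example):

```r
set.seed(1)
N <- 0; xbar <- 0          # cumulative sample size and mean
full <- numeric(0)         # kept only to check the result
for (b in 1:3) {
  x_new <- rnorm(100)      # a new batch arrives
  n_new <- length(x_new)
  # update the cumulative mean using only the new batch's summary
  xbar <- (N * xbar + sum(x_new)) / (N + n_new)
  N <- N + n_new
  full <- c(full, x_new)
}
all.equal(xbar, mean(full))  # TRUE: matches the full-data mean
```

The working matrix M and slice statistics are updated in the same spirit, so the cost per batch does not grow with the total sample size.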
rtpsdr(x, y, obj = NULL, h = 10, lambda = 1)

An object of class c("rtpsdr", "psdr") containing:
x, y: latest batch data
M: working matrix
evalues, evectors: eigen-decomposition of M (central subspace basis)
N: cumulative sample size
Xbar: cumulative mean vector
r: slice-specific coefficient matrix
A: intermediate matrix carried over for subsequent updates; see Artemiou et al. (2021)
loss: "lssvm" (continuous) or "wlssvm" (binary)
fit: metadata (mode="realtime", H, cutpoints, weight_cutpoints, lambda, etc.)
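Given the eigen-decomposition stored in the returned object, the sufficient predictors are obtained by projecting the data onto the leading eigenvectors. A minimal sketch, using a stand-in symmetric working matrix in place of a fitted object (the choice d = 2 and the matrix M here are purely illustrative):

```r
set.seed(1)
p <- 5; n <- 100; d <- 2
x <- matrix(rnorm(n * p), n, p)
# stand-in for the working matrix M of a fitted object
M <- crossprod(scale(x, scale = FALSE)) / n
eig <- eigen(M, symmetric = TRUE)
B_hat <- eig$vectors[, 1:d, drop = FALSE]  # plays the role of evectors[, 1:d]
sufficient_pred <- x %*% B_hat             # n x d reduced predictors
dim(sufficient_pred)
```

With a real fit, `B_hat` would be taken from the object's `evectors` component instead.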
x: predictor matrix of the new data batch
y: response vector of the new batch; continuous for regression or binary for classification
obj: the latest output object from psdr or rtpsdr; NULL to start a new fit
h: unified control for slicing or weighting; accepts either an integer (number of slices or weights) or a numeric vector (explicit grid)
lambda: hyperparameter for the loss function; the default is 1
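One plausible reading of the integer form of h, shown here purely for illustration (this is an assumption about the internal slicing, not the package's documented behavior): an integer h yields h slices from equally spaced quantiles of y, while a numeric vector is used directly as the grid.

```r
set.seed(1)
y <- rnorm(300)
h <- 8
# integer h: h + 1 quantile boundaries define h slices (assumed behavior)
cutpoints <- quantile(y, probs = seq(0, 1, length.out = h + 1))
length(cutpoints)
# numeric vector h: supplied values would instead be used as the grid directly
my_grid <- c(-1, 0, 1)
```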
Jungmin Shin (c16267@gmail.com), Seung Jun Shin (sjshin@korea.ac.kr), Andreas Artemiou (artemiou@uol.ac.cy)
Artemiou, A. and Dong, Y. (2016)
Sufficient dimension reduction via principal Lq support vector machine,
Electronic Journal of Statistics 10: 783–805.
Artemiou, A., Dong, Y. and Shin, S. J. (2021)
Real-time sufficient dimension reduction through principal least
squares support vector machines, Pattern Recognition 112: 107768.
Kim, B. and Shin, S. J. (2019)
Principal weighted logistic regression for sufficient dimension
reduction in binary classification, Journal of the Korean Statistical Society 48(2): 194–206.
Li, B., Artemiou, A. and Li, L. (2011)
Principal support vector machines for linear and
nonlinear sufficient dimension reduction, Annals of Statistics 39(6): 3182–3210.
Soale, A.-N. and Dong, Y. (2022)
On sufficient dimension reduction via principal asymmetric
least squares, Journal of Nonparametric Statistics 34(1): 77–94.
Wang, C., Shin, S. J. and Wu, Y. (2018)
Principal quantile regression for sufficient dimension
reduction with heteroscedasticity, Electronic Journal of Statistics 12(2): 2114–2140.
Shin, S. J., Wu, Y., Zhang, H. H. and Liu, Y. (2017)
Principal weighted support vector machines for sufficient dimension reduction in
binary classification, Biometrika 104(1): 67–81.
Li, L. (2007)
Sparse sufficient dimension reduction, Biometrika 94(3): 603–613.
See also: psdr, npsdr
set.seed(1)
p <- 5; m <- 300; B <- 3
obj <- NULL
for (b in 1:B) {
x <- matrix(rnorm(m*p), m, p)
y <- x[,1]/(0.5+(x[,2]+1)^2) + 0.2*rnorm(m)
obj <- rtpsdr(x, y, obj=obj, h=8, lambda=1)
}
print(obj)
summary(obj)