Description

ldr creates an object of class core, lad, or pfc to estimate a sufficient dimension reduction subspace using covariance reducing models (CORE), likelihood acquired directions (LAD), or principal fitted components (PFC).

Usage
ldr(X, y = NULL, fy = NULL, Sigmas = NULL, ns = NULL, numdir = NULL, nslices = NULL, model = c("core", "lad", "pfc"), numdir.test = FALSE, ...)
Arguments

X: Data matrix with n rows of observations and p columns of predictors. The predictors are assumed to have a continuous distribution.

y: Response vector of n observations. It can be continuous or categorical.

fy: Basis function, obtained using bf or defined by the user. It is a function of y alone and has independent column vectors. It is used exclusively with pfc. See bf for detail.

Sigmas: List of sample covariance matrices, one per group or slice. It is used exclusively with core.

ns: Vector of sample sizes corresponding to the covariance matrices in Sigmas. It is used exclusively with core.

numdir: Number of directions to estimate for the reduction. For pfc, numdir must be less than or equal to the minimum of p and r, where r is the number of columns of fy.

nslices: Number of slices. When calling lad with a continuous y, nslices is the number of slices used to discretize the response.

model: One of "pfc", "lad", or "core".

numdir.test: Boolean. If FALSE, the chosen model is fit with the provided numdir. If TRUE, the model is fit for all dimensions less than or equal to numdir.

...: Additional arguments passed to core, lad, or pfc.

Value

An object of class core, lad, or pfc. The output depends on the model used. See pfc, lad, and core for further detail.

Details

For CORE, given a set of $h$ covariance matrices, the goal is to find a sufficient reduction that accounts for the heterogeneity among the population covariance matrices. See the documentation of core for details.
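The examples at the end of this page do not exercise CORE; the following is a minimal sketch only, assuming the Sigmas/ns interface documented above and an ad hoc three-slice grouping of the bigmac response. The grouping and object names are illustrative, not part of the package examples.

# Sketch only: supply within-slice covariance matrices and sample sizes to core
data(bigmac)
grp <- cut(bigmac[, 1], breaks = 3)                  # ad hoc slicing of the response
Xg <- split(as.data.frame(bigmac[, -1]), grp)        # predictors within each slice
Sigmas <- lapply(Xg, function(d) cov(as.matrix(d)))  # within-slice covariance matrices
ns <- sapply(Xg, nrow)                               # within-slice sample sizes
fit0 <- ldr(Sigmas=Sigmas, ns=ns, numdir=2, model="core")
summary(fit0)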
For PFC, the conditional mean of the predictors is modeled as $\mu_y = \mu + \Gamma \beta f_y$, with various structures for the conditional covariance $\Delta$. The simplest is the isotropic ("iso") structure, with $\Delta = \delta^2 I_p$. The anisotropic ("aniso") PFC model assumes that $\Delta = \mathrm{diag}(\delta_1^2, \ldots, \delta_p^2)$, so that the conditional predictors are independent and on different measurement scales. The unstructured ("unstr") PFC model allows a general structure for $\Delta$. Extended structures are also considered. See the help file of pfc for more detail.
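The isotropic structure is not used in the examples below; a minimal sketch, mirroring those examples but with structure="iso", could be:

# Sketch only: isotropic PFC with a quadratic polynomial basis
data(bigmac)
fit_iso <- ldr(X=bigmac[,-1], y=bigmac[,1], fy=bf(y=bigmac[,1], case="poly", degree=2),
  numdir=2, structure="iso", model="pfc")
summary(fit_iso)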
LAD assumes that the response $Y$ is discrete; a continuous response is sliced into finite categories to meet this condition. It estimates the central subspace $\mathcal{S}_{Y|X}$ by modeling both the conditional mean $\mu_y$ and the conditional covariance $\Delta_y$ of the predictors. See lad for more detail.
References

Cook, R. D. (2007). Fisher Lecture: Dimension reduction in regression (with discussion). Statistical Science 22, 1--26.

Cook, R. D. and Forzani, L. (2008a). Covariance reducing models: An alternative to spectral modelling of covariance matrices. Biometrika 95, 799--812.

Cook, R. D. and Forzani, L. (2008b). Principal fitted components for dimension reduction in regression. Statistical Science 23, 485--501.

Cook, R. D. and Forzani, L. (2009). Likelihood-based sufficient dimension reduction. Journal of the American Statistical Association 104, 197--208.
See Also

pfc, lad, core

Examples

data(bigmac)
# PFC with a piecewise constant basis over 5 slices of the response and an unstructured Delta
fit1 <- ldr(X=bigmac[,-1], y=bigmac[,1], fy=bf(y=bigmac[,1], case="pdisc", degree=0, nslices=5),
  numdir=3, structure="unstr", model="pfc")
summary(fit1)
plot(fit1)
# PFC with a quadratic polynomial basis and an anisotropic Delta
fit2 <- ldr(X=bigmac[,-1], y=bigmac[,1], fy=bf(y=bigmac[,1], case="poly", degree=2),
  numdir=2, structure="aniso", model="pfc")
summary(fit2)
plot(fit2)
# LAD with the continuous response sliced into 5 categories
fit3 <- ldr(X=as.matrix(bigmac[,-1]), y=bigmac[,1], model="lad", nslices=5)
summary(fit3)
plot(fit3)
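The numdir.test argument is not exercised above; a minimal sketch, refitting the LAD model for all dimensions up to numdir (object name is illustrative):

# Sketch only: fit LAD for all dimensions up to numdir and compare
fit4 <- ldr(X=as.matrix(bigmac[,-1]), y=bigmac[,1], model="lad", nslices=5,
  numdir=3, numdir.test=TRUE)
summary(fit4)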