
refund (version 0.1-13)

fpca2s: Functional principal component analysis by a two-stage method

Description

This function performs functional PCA by applying an ordinary singular value decomposition to the functional data matrix and then smoothing the right singular vectors with smoothing splines.
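
The sketch below illustrates the two-stage idea on simulated toy data (it is an illustration, not the package's internal code, and the number of components k is hand-picked here): take the SVD of the centered data matrix, then smooth each retained right singular vector with a smoothing spline.

  set.seed(1)
  I <- 20; J <- 200
  t <- seq(0, 1, length.out = J)
  Y <- outer(rnorm(I), sin(2 * pi * t)) + matrix(rnorm(I * J, sd = 0.5), I, J)
  Y0 <- sweep(Y, 2, colMeans(Y))          # center the columns
  sv <- svd(Y0)                           # stage 1: ordinary SVD
  k  <- 1                                 # number of components kept (hand-picked for this toy example)
  V  <- sv$v[, 1:k, drop = FALSE]         # right singular vectors
  V_smooth <- apply(V, 2, function(v) smooth.spline(t, v)$y)  # stage 2: smooth each vector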

Usage

fpca2s(Y, npc = NA, center = TRUE, argvals = NULL, smooth = TRUE)

Arguments

Y
data matrix (rows: observations; columns: grid of evaluation points)
npc
how many smooth SVs to try to extract. If NA (the default), the hard thresholding rule of Gavish and Donoho (2014) is used; a rough sketch of the rule follows this list. Application of this rule to functional PCA should be regarded as experimental.
center
center Y so that its column means are 0? Defaults to TRUE.
argvals
index vector of J entries for the data in Y; defaults to a regular sequence from 0 to 1.
smooth
logical; defaults to TRUE. If FALSE, the eigenvectors are not smoothed.
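
As referenced in the npc entry above, the following rough sketch shows one reading of the Gavish and Donoho (2014) hard-thresholding rule with unknown noise level (the helper estimate_npc is hypothetical and is not the package's internal code): singular values larger than omega(beta) times the median singular value are kept, where beta is the ratio of the matrix dimensions.

  estimate_npc <- function(Y) {  # hypothetical helper, for illustration only
    d     <- svd(sweep(Y, 2, colMeans(Y)))$d                     # singular values of the centered data
    beta  <- min(dim(Y)) / max(dim(Y))                           # aspect ratio of the matrix
    omega <- 0.56 * beta^3 - 0.95 * beta^2 + 1.82 * beta + 1.43  # approximate omega(beta) from the paper
    sum(d > omega * median(d))                                   # number of singular values kept
  }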

Value

An fpca object with components:

  • Yhat: predicted data matrix
  • Y: observed data
  • scores: matrix of scores
  • mu: mean function
  • npc: number of principal components
  • efunctions: matrix of eigenvectors
  • evalues: vector of eigenvalues

Details

The eigenvalues are the same as those obtained from eigendecomposition of the sample covariance matrix. Please note that we expect to merge this function into fpca.ssvd in future versions of the package.
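
The underlying identity (eigenvalues of the sample covariance matrix equal rescaled squared singular values of the centered data matrix) can be checked directly; the sketch below uses simulated data and assumes the usual 1/(I-1) covariance scaling.

  set.seed(2)
  I <- 10; J <- 50
  Y <- matrix(rnorm(I * J), I, J)
  Y0 <- sweep(Y, 2, colMeans(Y))
  ev_svd <- svd(Y0)$d^2 / (I - 1)            # squared singular values, rescaled
  ev_cov <- eigen(cov(Y))$values[1:(I - 1)]  # leading eigenvalues of the J x J sample covariance
  all.equal(ev_svd[1:(I - 1)], ev_cov)       # TRUE up to numerical error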

References

Xiao, L., Ruppert, D., Zipunnikov, V., and Crainiceanu, C. (2013). Fast covariance estimation for high-dimensional functional data (submitted). http://arxiv.org/abs/1306.5718

Gavish, M., and Donoho, D. L. (2014). The optimal hard threshold for singular values is 4/sqrt(3). IEEE Transactions on Information Theory, 60(8), 5040-5053.

See Also

fpca.sc and fpca.face for FPCA based on smoothing a covariance estimate; fpca.ssvd for another SVD-based approach.

Examples

#### settings
  I <- 50 # number of subjects
  J <- 3000 # dimension of the data
  t <- (1:J)/J # a regular grid on [0,1]
  N <- 4 # number of eigenfunctions
  sigma <- 2 # standard deviation of the random noise
  lambdaTrue <- c(1,0.5,0.5^2,0.5^3) # True eigenvalues

  case <- 1
  ### True Eigenfunctions

  if(case==1) phi <- sqrt(2)*cbind(sin(2*pi*t),cos(2*pi*t),
                                   sin(4*pi*t),cos(4*pi*t))
  if(case==2) phi <- cbind(rep(1,J),sqrt(3)*(2*t-1),
                           sqrt(5)*(6*t^2-6*t+1),
                           sqrt(7)*(20*t^3-30*t^2+12*t-1))

  ###################################################
  ########     Generate Data            #############
  ###################################################
  xi <- matrix(rnorm(I*N), I, N)
  xi <- xi %*% diag(sqrt(lambdaTrue))       # scores with variances lambdaTrue
  X <- xi %*% t(phi)                        # noiseless signal, of size I by J
  Y <- X + sigma*matrix(rnorm(I*J), I, J)   # add white noise

  results <- fpca2s(Y,npc=4,argvals=t)
  ###################################################
  ####               SVDS               ########
  ###################################################
  Phi <- results$efunctions
  eigenvalues <- results$evalues

  # flip signs so each estimated eigenfunction is positively correlated with the truth
  for(k in 1:N){
    if(Phi[,k] %*% phi[,k] < 0)
      Phi[,k] <- -Phi[,k]
  }

 ### plot eigenfunctions: estimated (red) vs. true (black);
 ### the factor sqrt(J) rescales eigenvectors to the scale of the true eigenfunctions
 par(mfrow = c(N/2, 2))
 idx <- (1:(J/10))*10   # plot every 10th grid point
 for(k in 1:N){
      plot(t[idx], Phi[idx,k]*sqrt(J), type = "l", lwd = 3,
           ylim = c(-2,2), col = "red",
           ylab = paste("Eigenfunction ", k, sep = ""),
           xlab = "t", main = "SVDS")

      lines(t[idx], phi[idx,k], lwd = 2, col = "black")
      }
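
 ### a possible follow-up check (not part of the original example): compare the
 ### decay of the estimated eigenvalues with the true values, up to scaling,
 ### and measure how well Yhat recovers the noiseless signal X
 round(eigenvalues[1:N] / eigenvalues[1], 3)   # estimated eigenvalue ratios
 round(lambdaTrue / lambdaTrue[1], 3)          # true ratios: 1, 0.5, 0.25, 0.125
 mean((results$Yhat - X)^2)                    # average squared reconstruction error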
