HDMFA (version 0.1.1)

High-Dimensional Matrix Factor Analysis

Description

High-dimensional matrix factor models have drawn much attention because observations in fields such as macroeconomics and finance are often naturally structured as matrices. Such data also frequently exhibit heavy tails, so robust estimation procedures are important; we address this by replacing the least-squares loss with the Huber loss. We propose two robust factor-analysis algorithms based on the Huber loss. The first minimizes the Huber loss of the Frobenius norm of the idiosyncratic error, which leads to a weighted iterative projection approach for estimating the parameters and is hence named Robust Matrix Factor Analysis (RMFA); see He et al. (2023) for details. The second minimizes the element-wise Huber loss, which can be solved by an iterative Huber regression (IHR) algorithm; see He et al. (2023) for details. The package also provides the \(\alpha\)-PCA algorithm of Chen & Fan (2021) and the projected estimation (PE) method of Yu et al. (2022), as well as methods for determining the pair of factor numbers.
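To see why the Huber loss yields robustness, a minimal base-R sketch (not part of the package; the tuning constant `tau` here is illustrative, not a package default) shows that the loss is quadratic near zero and linear in the tails, which bounds the influence of heavy-tailed errors:

```r
# Huber loss: quadratic for |r| <= tau, linear beyond.
# tau = 1.345 is an illustrative choice, not a HDMFA default.
huber <- function(r, tau = 1.345) {
  ifelse(abs(r) <= tau, r^2 / 2, tau * (abs(r) - tau / 2))
}

# A large residual is penalized linearly rather than quadratically:
r <- 10
huber(r)   # grows linearly in |r| ...
r^2 / 2    # ... while the least-squares penalty grows quadratically
```

In RMFA the analogous loss is applied to the Frobenius norm of the idiosyncratic error; in IHR it is applied element-wise.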

Install

install.packages('HDMFA')

Version

0.1.1

License

GPL-2 | GPL-3

Maintainer

Ran Zhao

Last Published

January 20th, 2024

Functions in HDMFA (0.1.1)

KPE

Estimating the Pair of Factor Numbers via Eigenvalue Ratios Corresponding to PE
KPCA

Estimating the Pair of Factor Numbers via Eigenvalue Ratios Corresponding to \(\alpha\)-PCA
MHFA

Matrix Huber Factor Analysis
KMHFA

Estimating the Pair of Factor Numbers via Eigenvalue Ratios or Rank Minimization
alpha_PCA

Statistical Inference for High-Dimensional Matrix-Variate Factor Model
PE

Projected Estimation for Large-Dimensional Matrix Factor Models
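A rough sketch of how the functions listed above might be combined in a typical workflow. The argument names and orders used here (a \(T \times p_1 \times p_2\) data array `X`, factor numbers `m1`/`m2`, `kmax`, `method`) are assumptions and should be verified against the package help pages (`?MHFA`, `?PE`, `?KMHFA`) before use:

```r
# Hypothetical workflow sketch — argument names/orders are assumptions,
# not verified against the installed package; consult ?MHFA, ?PE, ?KMHFA.
library(HDMFA)

set.seed(1)
T <- 100; p1 <- 20; p2 <- 20
X <- array(rnorm(T * p1 * p2), dim = c(T, p1, p2))  # T observations of p1 x p2 matrices

# Determine the pair of factor numbers (eigenvalue-ratio variants)
k_pe  <- KPE(X, kmax = 8)                # corresponding to PE
k_pca <- KPCA(X, kmax = 8, alpha = 0)    # corresponding to alpha-PCA

# Fit matrix factor models with the chosen factor numbers
fit_pe    <- PE(X, m1 = 2, m2 = 2)                  # projected estimation
fit_alpha <- alpha_PCA(X, alpha = 0, m1 = 2, m2 = 2)
fit_rmfa  <- MHFA(X, m1 = 2, m2 = 2, method = "P")  # robust; "P" assumed to select
                                                    # the weighted-projection (RMFA) variant
```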