EFAfactors (version 1.2.4)

Determining the Number of Factors in Exploratory Factor Analysis

Description

Provides a collection of standard factor retention methods for Exploratory Factor Analysis (EFA), making it easier to determine the number of factors. Traditional methods are readily available, such as the scree plot by Cattell (1966), the Kaiser-Guttman Criterion (KGC) by Guttman (1954) and Kaiser (1960), and flexible Parallel Analysis (PA) by Horn (1965) based on eigenvalues from PCA or EFA. The package also implements several newer methods, such as the Empirical Kaiser Criterion (EKC) by Braeken and van Assen (2017), Comparison Data (CD) by Ruscio and Roche (2012), and the Hull method by Lorenzo-Seva et al. (2011), as well as AI-based methods such as the Comparison Data Forest (CDF) by Goretzko and Ruscio (2024) and Factor Forest (FF) by Goretzko and Buhner (2020). Additionally, it includes a deep neural network (DNN) trained on large-scale datasets that can efficiently and reliably determine the number of factors.
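To illustrate, a minimal sketch of running a few of the classical retention methods listed below on the bundled `data.bfi` dataset (25 personality items). The argument layout and the column subsetting are assumptions inferred from this index, not verbatim package examples — consult `?PA`, `?EKC`, and `?data.bfi` for the documented signatures:

```r
## Hedged sketch: argument names and the 1:25 item-column subset are
## assumptions; see each function's help page for the actual interface.
library(EFAfactors)

data(data.bfi)                          # 25 personality items, 5 factors
response <- na.omit(data.bfi[, 1:25])   # assumed: first 25 columns are the items

PA(response)    # Parallel Analysis (Horn, 1965)
EKC(response)   # Empirical Kaiser Criterion (Braeken & van Assen, 2017)
KGC(response)   # Kaiser-Guttman Criterion (Guttman, 1954; Kaiser, 1960)
```

Each method returns its own suggested number of factors; `EFAvote` (listed below) can then aggregate several such suggestions by majority vote.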

Install

install.packages('EFAfactors')
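The DNN-based method relies on Python libraries at run time; a hedged post-install sketch using `check_python_libraries()` from the function index below (its exact behavior is assumed from its description "Check and Install Python Libraries (numpy and onnxruntime)"):

```r
## Hedged sketch: install from CRAN, then verify the optional Python
## dependencies used by the pre-trained networks.
install.packages("EFAfactors")
library(EFAfactors)
check_python_libraries()   # assumed to check/install numpy and onnxruntime
```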

Monthly Downloads

210

Version

1.2.4

License

GPL-3

Maintainer

Haijiang Qin

Last Published

October 14th, 2025

Functions in EFAfactors (1.2.4)

Hull

The Hull Approach
GenData

Simulating Data Following John Ruscio's RGenData
KGC

Kaiser-Guttman Criterion
MAP

Minimum Average Partial (MAP) Test
check_python_libraries

Check and Install Python Libraries (numpy and onnxruntime)
data.DAPCS

20-item Dependency-Oriented and Achievement-Oriented Psychological Control Scale (DAPCS)
NN

The Pre-trained Neural Networks for Determining the Number of Factors
PA

Parallel Analysis
af.softmax

An Activation Function: Softmax
STOC

Scree Test Optimal Coordinate (STOC)
load.NN

Load the Pre-trained Neural Networks for Determining the Number of Factors
extractor.feature.NN

Extracting Features for the Pre-trained Neural Networks for Determining the Number of Factors
print

Print Methods
predictLearner.classif.xgboost.earlystop

Prediction Function for the Tuned XGBoost Model with Early Stopping
data.scaler.LSTM

The Scaler for the Pre-trained Long Short-Term Memory (LSTM) Network
plot

Plot Methods
normalizor

Feature Normalization for the Pre-trained Neural Networks for Determining the Number of Factors
extractor.feature.FF

Extracting Features According to Goretzko & Buhner (2020)
data.datasets.LSTM

Subset Dataset for Training the Long Short Term Memory (LSTM) Network
data.scaler.DNN

The Scaler for the Pre-trained Deep Neural Network (DNN)
factor.analysis

Factor Analysis by Principal Axis Factoring
load.xgb

Load the Tuned XGBoost Model
model.xgb

The Tuned XGBoost Model for Determining the Number of Factors
load.scaler

Load the Scaler for the Pre-trained Neural Networks for Determining the Number of Factors
data.datasets.DNN

Subset Dataset for Training the Deep Neural Network (DNN)
data.bfi

25 Personality Items Representing 5 Factors
EFAkmeans

K-means for EFA
CD

The Comparison Data (CD) Approach
EFAhclust

Hierarchical Clustering for EFA
EFAsim.data

Simulate Data that Conforms to the Theory of Exploratory Factor Analysis
EFAscreet

Scree Plot
EFAvote

Voting Method for Number of Factors in EFA
EKC

Empirical Kaiser Criterion
FF

Factor Forest (FF) Powered by a Tuned XGBoost Model for Determining the Number of Factors
EFAindex

Various Indices in EFA
CDF

The Comparison Data Forest (CDF) Approach