Network Meta-Analytic Predictive (NAP) Priors
Description
The NAPrior package facilitates the implementation of the Network Meta-Analytic Predictive (NAP) prior framework, specifically designed to address changes in the Standard of Care (SoC) during ongoing randomized controlled trials (RCTs). The framework synthesizes in-trial data from both pre- and post-SoC change periods by leveraging external trial data—specifically the head-to-head comparisons between the original and new SoC that established the new SoC—to bridge the two phases of evidence.
To ensure robust inference, the package implements two robustified priors: (i) the mixture NAP (mNAP), which incorporates a noninformative prior via a fixed mixing weight, and (ii) the elastic NAP (eNAP), which adaptively adjusts mixing weight and information borrowing based on the consistency between direct and indirect evidence. The NAP framework is fully prespecifiable, easy to calibrate, and computationally straightforward. This package provides a comprehensive toolkit to calibrate eNAP tuning parameters, generate NAP priors, simulate operating characteristics, and obtain posterior distributions.
Installation
System requirements
The NAPrior package fits Bayesian models using JAGS via the
R2jags interface.
Please ensure that JAGS is installed on your system before running model
functions.
- macOS: `brew install jags`
- Ubuntu/Debian: `sudo apt-get install jags`
- Windows: download the installer from https://mcmc-jags.sourceforge.io/
Install and load
install.packages("devtools")
devtools::install_github("EstravenZZZ/NAPrior")
library(NAPrior)
Quick start
Consider a scenario in which the SoC changes mid-trial in a registrational or pivotal RCT that was initially designed to compare an experimental treatment $E$ against an SoC $C_1$ using 1:1 randomization of $2N$ patients. At the outset, a total of $2n_1$ patients are randomized between $E$ and $C_1$. Subsequently, a mid-trial SoC change occurs, and another $2n_2$ patients are randomized between $E$ and the new SoC $C_2$. As a result, the trial data consist of two components: data directly comparing $E$ versus $C_1$ (denoted as $D_{E,C_1}$) and data directly comparing $E$ versus $C_2$ (denoted as $D_{E,C_2}$). Meanwhile, summary statistics from the external trial(s) that motivated the SoC change are also available (denoted as $D_{C_2,C_1}$). The primary objective of the RCT is to evaluate the treatment effect of $E$ versus $C_2$, denoted as $\theta_{E,C_2}$. For estimating $\theta_{E,C_2}$, $D_{E,C_2}$ provides direct evidence, while $D_{E,C_1}$ and $D_{C_2,C_1}$ provide indirect evidence.
Let $y_{E,C_2}$, $y_{E,C_1}$, and $y_{C_2,C_1}$ denote the estimated log
hazard ratios (log-HRs) calculated from the datasets $D_{E, C_2}$,
$D_{E, C_1}$ and $D_{C_2, C_1}$, respectively, with sampling variances
$s^2_{E,C_2}$, $s^2_{E,C_1}$, and $s^2_{C_2,C_1}$. These estimates can
be derived, for example, using the Cox proportional hazards model.
Following standard meta-analytic convention, we assume that
$s^2_{E,C_2}$, $s^2_{E,C_1}$, and $s^2_{C_2,C_1}$ are known. This
package accommodates external data ($D_{C_2,C_1}$) from either a single
trial or multiple trials by allowing users to supply a scalar or a
vector for the y_C2C1 and s_C2C1 arguments accordingly.
1) Calibrate the tuning parameters for eNAP prior
The eNAP uses a dynamic weight based on the Bucher test statistic $Z$, which quantifies the extent of consistency between direct and indirect evidence and is mapped to the mixing weight through an elastic function:
$$w(Z) = \bigl\{ 1 + \exp\bigl(a + b \log(Z + 1)\bigr) \bigr\}^{-1},$$
where the Bucher test statistic scales the direct-versus-indirect discrepancy by its standard error:
$$Z = \frac{ \bigl| y_{E,C_2} - (y_{E,C_1} - y_{C_2,C_1}) \bigr| }{ \sqrt{s^2_{E,C_2} + s^2_{E,C_1} + s^2_{C_2,C_1}} }.$$
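To make the mapping concrete, the two formulas can be sketched in a few lines of R (`bucher_Z` and `elastic_w` are illustrative helper names, not exports of NAPrior):

```r
# Sketch of the elastic weighting above (hypothetical helpers, not the NAPrior API).
# Z scales the direct-vs-indirect discrepancy by its standard error.
bucher_Z <- function(y_EC2, y_EC1, y_C2C1, s_EC2, s_EC1, s_C2C1) {
  abs(y_EC2 - (y_EC1 - y_C2C1)) / sqrt(s_EC2 + s_EC1 + s_C2C1)
}

elastic_w <- function(Z, a, b) 1 / (1 + exp(a + b * log(Z + 1)))

# Perfect consistency (Z = 0) gives the largest weight, 1 / (1 + exp(a));
# the weight shrinks toward 0 as the discrepancy Z grows.
elastic_w(c(0, 0.5, 2), a = -1.9, b = 5.8)
```

Larger $b$ makes the transition from full borrowing to no borrowing steeper; $a$ anchors the weight at exact consistency.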
To calibrate the tuning parameters $(a, b)$, the user provides:
- $\delta$: A clinically meaningful difference on the log-HR scale; differences beyond this threshold indicate strong inconsistency between direct and indirect evidence.
- $t_1$: The target posterior mixing weight under exact consistency ($Z = 0$), typically set close to 1 (e.g., $t_1 = 0.999$).
- $t_0$: The target posterior mixing weight under strong inconsistency, evaluated at $Z_\delta = \delta/\sqrt{s^2_{E,C_2} + s^2_{E,C_1} + s^2_{C_2,C_1}}$, typically set close to 0 (e.g., $t_0 = 0.05$).
These inputs calibrate $(a, b)$ such that the updated posterior weight $w'$ satisfies $w'(0) = t_1$ and $w'(Z_\delta) = t_0$. A companion plot() method is provided for visualizing the resulting posterior updated weight; simply call plot() on the returned object to view the weighting function as a function of evidence consistency.
tuned_ab <- tune_param_eNAP(
y_C2C1=c(-0.4,-0.5,-0.5),
s_EC2 = 0.12^2, # Var(E:C2)
s_EC1 = 0.16^2, # Var(E:C1)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2), # vector => multiple external trials
delta = 0.5, # clinically meaningful difference on log-HR
t1 = 0.999, # near full borrowing at Z = 0
t0 = 0.05 # near zero borrowing at Z(delta)
)
c(tuned_ab$a,tuned_ab$b)
#> [1] -1.903302 5.848514
plot(tuned_ab)
2) Construct NAP priors
To construct an mNAP or eNAP prior, use the NAP_prior() function. This
requires summary statistics for the indirect evidence ($D_{E,C_1}$ and
$D_{C_2,C_1}$) and a choice of weighting method ("adaptive" for eNAP or
"fixed" for mNAP). For mNAP, a fixed mixing weight must be specified (a
weight of 1 yields the plain NAP prior, assuming full transitivity). For
eNAP, the function requires the tuning parameters $(a, b)$. A companion
plot() method is available to visualize the resulting prior
distribution.
Since the dynamic weight for eNAP depends on direct evidence
($D_{E,C_2}$) that may not yet be observed (e.g., at the time of the SoC
change), NAP_prior() offers two operational modes:
- With assumed direct evidence: If an assumed log-HR and sampling variance for $D_{E,C_2}$ are provided, the function calculates the specific dynamic weight and the resulting prior under those assumptions.
- Without direct evidence: If no assumed direct evidence is provided, the function returns the informative (NAP) and vague components separately, allowing for later synthesis.
The function returns an object of class "NAP_prior", which serves as
the input for subsequent simulations or posterior data analyses.
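Concretely, each prior is a two-component normal mixture: weight $w$ on the informative NAP component and $1 - w$ on a vague $N(0, \tau_0)$ component, where tau0 is the vague variance. Its density can be sketched as follows (dmix_nap is an illustrative helper, not the package's internal representation):

```r
# Density of the two-component NAP mixture prior (sketch only):
# w * N(m_nap, v_nap) + (1 - w) * N(0, tau0)
dmix_nap <- function(theta, w, m_nap, v_nap, tau0 = 1000) {
  w * dnorm(theta, mean = m_nap, sd = sqrt(v_nap)) +
    (1 - w) * dnorm(theta, mean = 0, sd = sqrt(tau0))
}

# An mNAP-style mixture with w = 0.5, plotted on a grid of log-HR values:
curve(dmix_nap(x, w = 0.5, m_nap = -0.06, v_nap = 0.0452),
      from = -1, to = 1, xlab = "log-HR", ylab = "prior density")
```

The vague component is nearly flat over the plotted range, so the mixture looks like a scaled-down informative density sitting on a small constant floor.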
NAP (full borrowing, w = 1)
NAP_test1 <- NAP_prior(
weight_mtd = "fixed", w = 1, # w = 1 ⇒ informative component only (NAP)
y_EC1 = -0.36, s_EC1 = 0.16^2,
y_C2C1 = -0.30, s_C2C1 = 0.14^2 # single external → FE
)
NAP_test1$table
#> NAP (Informative) Vague
#> Mixing Weight 1.00000000 0
#> Mean -0.05999666 0
#> Variance 0.04519896 1000
#> ESS (events) 88.49761046 NA
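The informative component above can be reproduced by hand: with a single external trial, the NAP mean is the indirect estimate $y_{E,C_1} - y_{C_2,C_1}$ and its variance is (approximately) the sum of the two sampling variances, and the reported ESS in events is consistent with the common log-HR approximation $\mathrm{ESS} \approx 4/\mathrm{Var}$ (an inference from the printed values, not documented package behavior):

```r
m_nap <- -0.36 - (-0.30)     # indirect estimate: y_EC1 - y_C2C1
v_nap <- 0.16^2 + 0.14^2     # sum of the two sampling variances
c(mean = m_nap, var = v_nap, ess = 4 / v_nap)
# matches the printed table to about three decimal places
```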
plot(NAP_test1)
mNAP with a fixed weight of 0.5
mNAP_test1 <- NAP_prior(
weight_mtd = "fixed", w = 0.50, # fixed mixture weight
y_EC1 = -0.36, s_EC1 = 0.16^2,
y_C2C1 = -0.30, s_C2C1 = 0.14^2, # single external → FE
tau0 = 1000
)
mNAP_test1$table
#> NAP (Informative) Vague
#> Mixing Weight 0.50000000 5e-01
#> Mean -0.05999666 0e+00
#> Variance 0.04519896 1e+03
#> ESS (events) 88.49761046 NA
plot(mNAP_test1, main = "mNAP prior")
eNAP without direct data $D_{E,C_2}$ provided
eNAP_test1 <- NAP_prior(
weight_mtd = "adaptive",
a = tuned_ab$a, b = tuned_ab$b, # from calibration
y_EC1 = -0.36, s_EC1 = 0.16^2, # E:C1 (current, pre-change)
y_C2C1 = c(-0.28, -0.35, -0.31), # C2:C1 (external trials)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2),
tau0 = 1000 # vague variance
)
eNAP_test1$table
#> NAP (Informative) Vague
#> Mixing Weight TBD TBD
#> Mean -0.0437723254666285 0
#> Variance 0.0306875093991601 1000
#> ESS (events) 130.346192255976 <NA>
eNAP with direct data $D_{E,C_2}$ provided
eNAP_test2 <- NAP_prior(
weight_mtd = "adaptive",
a = tuned_ab$a, b = tuned_ab$b, # from calibration
y_EC1 = -0.36, s_EC1 = 0.16^2, # E:C1 (current, pre-change)
y_C2C1 = c(-0.28, -0.35, -0.31), # C2:C1 (external trials)
s_C2C1 = c(0.12^2, 0.11^2, 0.15^2),
y_EC2 = 0, s_EC2 = 0.2^2,
tau0 = 1000 # vague variance
)
eNAP_test2$table
#> NAP (Informative) Vague
#> Mixing Weight 0.73337079 0.2666292
#> Mean -0.04377233 0.0000000
#> Variance 0.03068751 1000.0000000
#> ESS (events) 130.34619226 NA
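The realized mixing weight of roughly 0.733 can be reproduced from the formulas above: pool the external trials by inverse-variance (fixed-effect) weighting, form the indirect estimate, compute the Bucher statistic $Z$ against the assumed direct evidence, and apply the elastic function with the calibrated $(a, b)$. This is a by-hand consistency check under an assumed pooling rule, not a package call; tiny discrepancies versus the printed table are possible if the package pools differently.

```r
y_C2C1 <- c(-0.28, -0.35, -0.31)
s_C2C1 <- c(0.12^2, 0.11^2, 0.15^2)
wts    <- 1 / s_C2C1                       # inverse-variance (fixed-effect) weights
y_pool <- sum(wts * y_C2C1) / sum(wts)     # pooled external log-HR
v_pool <- 1 / sum(wts)                     # variance of the pooled estimate

ind <- -0.36 - y_pool                                  # indirect estimate (~ -0.0438)
Z   <- abs(0 - ind) / sqrt(0.2^2 + 0.16^2 + v_pool)    # Bucher statistic
w   <- 1 / (1 + exp(-1.903302 + 5.848514 * log(Z + 1)))
round(w, 4)
#> [1] 0.7334
```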
plot(eNAP_test2, main = "eNAP prior")
3) Simulate operating characteristics
With an object previously obtained from the NAP_prior() function, use
the NAP_oc() function to simulate operating characteristics.
set.seed(123)
# Run JAGS quietly
oc <- NULL
invisible(capture.output(
oc <- NAP_oc(
NAP_prior = eNAP_test1,
theta_EC2 = 0.00, # true log-HR for E:C2
n_EC2 = 400, # total N for the post-SoC-change period
lambda = 1, # randomization ratio for the post-SoC-change period
sim_model = "Weibull", # "Exponential" or "Weibull"
model_param = c(shape = 1.2, rate = 0.05),
nsim = 100,
iter = 3000,
chains = 4
),
type = "output" # capture JAGS console output
))
# Now show only the summary in the knitted doc
summary(oc)
#> rep post_mean post_sd low95.2.5%
#> Min. : 1.00 Min. :-0.22316 Min. :0.08741 Min. :-0.40601
#> 1st Qu.: 25.75 1st Qu.:-0.07669 1st Qu.:0.08882 1st Qu.:-0.25138
#> Median : 50.50 Median :-0.01197 Median :0.08965 Median :-0.18399
#> Mean : 50.50 Mean :-0.01474 Mean :0.08987 Mean :-0.19030
#> 3rd Qu.: 75.25 3rd Qu.: 0.04194 3rd Qu.:0.09050 3rd Qu.:-0.13200
#> Max. :100.00 Max. : 0.15937 Max. :0.09567 Max. :-0.02844
#> hi95.97.5% prob_E_better prior_weight post_weight
#> Min. :-0.04267 Min. :0.04383 Min. :0.05955 Min. :0.8153
#> 1st Qu.: 0.09562 1st Qu.:0.31979 1st Qu.:0.28602 1st Qu.:0.9792
#> Median : 0.16111 Median :0.54958 Median :0.50131 Median :0.9918
#> Mean : 0.16160 Mean :0.55097 Mean :0.48724 Mean :0.9767
#> 3rd Qu.: 0.21759 3rd Qu.:0.80788 3rd Qu.:0.69114 3rd Qu.:0.9967
#> Max. : 0.35467 Max. :0.99133 Max. :0.86825 Max. :0.9997
#> sigma_hat
#> Min. :0.1144
#> 1st Qu.:0.1243
#> Median :0.1294
#> Mean :0.1306
#> 3rd Qu.:0.1367
#> Max. :0.1525
4) Calculate posterior
To obtain the posterior of the target estimand $\theta_{E,C_2}$, pass
the object generated by the NAP_prior() function, along with the
observed direct evidence (log-HR and sampling variance from
$D_{E,C_2}$), to the NAP_posterior() function.
res <- NAP_posterior(
NAP_prior = eNAP_test1,
y_EC2 = -0.20,
s_EC2 = 0.12^2,
iter = 4000,
chains = 4
)
#> Compiling model graph
#> Resolving undeclared variables
#> Allocating nodes
#> Graph information:
#> Observed stochastic nodes: 3
#> Unobserved stochastic nodes: 6
#> Total graph size: 34
#>
#> Initializing model
res$posterior_sum # mean, sd, 95% CI, prob_E_better, prior weight, post weight
#> post_mean post_sd low95 hi95 prob_E_better prior_weight
#> 1 -0.1778259 0.1105583 -0.4024329 0.03350202 0.951 0.2105426
#> post_weight sigma_hat
#> 1 0.9505 0.2902954
res$enap_prior # actual eNAP prior at analysis time with realized dynamic weight
#> NAP (Informative) Vague
#> Mixing Weight 0.21054264 0.7894574
#> Mean -0.04377233 0.0000000
#> Variance 0.03068751 1000.0000000
#> ESS (events) 130.34619226 NA
# res$jags_fit is stored but does not print by default
Core functions
- tune_param_eNAP() — Calibrates the eNAP tuning parameters $(a, b)$ based on user-defined consistency targets.
- NAP_prior() — Generates mNAP and eNAP prior objects using indirect evidence and the specified weighting method.
- NAP_posterior() — Computes the posterior distribution of the target estimand by combining a NAP_prior object with observed direct evidence ($D_{E,C_2}$).
- NAP_oc() — Simulates operating characteristics.
JAGS model specification
The package includes default BUGS/JAGS model strings as data objects:
- jags_model_FE
- jags_model_RE
You may pass
- a path to a .txt model file (e.g., model = "path/to/model_RE.txt"), or
- a string containing BUGS/JAGS code (e.g., model = jags_model_RE).
Internally, models are routed through .model_path_from(), which writes
text to a temporary file and passes it to R2jags::jags().
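A minimal sketch of that routing logic (illustrative only; the package's internal .model_path_from() may differ):

```r
# If `model` is an existing file path, use it directly; otherwise treat it as
# BUGS/JAGS code and write it to a temporary .txt file whose path can be
# handed to R2jags::jags() as model.file.
model_path_from_sketch <- function(model) {
  if (length(model) == 1 && file.exists(model)) return(model)
  path <- tempfile(fileext = ".txt")
  writeLines(model, path)
  path
}
```

For example, model_path_from_sketch(jags_model_RE) would return a temporary file path, while model_path_from_sketch("path/to/model_RE.txt") would return that path unchanged if the file exists.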
Tips & diagnostics
- Inputs: Sampling variances must be positive and finite.
- Stability: If the tuning parameter $|a|$ or $b$ reaches its internally prespecified upper limit (by default 5 and 50, respectively), we recommend reducing $t_1$, increasing $t_0$, or enlarging $\delta$ to enhance the reliability of posterior inference.
- Reproducibility: Set a seed before simulations (set.seed()).
- JAGS messages: Lines about "resolving undeclared variables" are normal. Most JAGS errors arise from name or length mismatches in the data or model.
Citing
If you use NAPrior in publications, please cite both the methodological paper and this package (full citation to be added once available).
Maintainer
Chunyi Zhang (czhang12@mdanderson.org)