The user specifies the (posterior) mean and standard error (or posterior standard deviation) of two estimated treatment effects, X and Y, which refer to the same pairwise comparison and are assumed to follow normal distributions. The function returns the Kullback-Leibler divergence (KLD) measure of (1) approximating X with Y, (2) approximating Y with X, and (3) their average.
kld_measure(mean_y, sd_y, mean_x, sd_x)

The function returns the following numeric results:
| kld_sym | The symmetric KLD value, calculated as the average of the two KLD values below. |
| kld_x_true | The KLD value when approximating X by Y (X is the 'truth'). |
| kld_y_true | The KLD value when approximating Y by X (Y is the 'truth'). |
| mean_y | A real number that refers to the mean of the estimated treatment effect Y on the scale of the selected effect measure (in logarithmic scale for relative effect measures). |
| sd_y | A positive real number that refers to the posterior standard deviation or the standard error of the estimated treatment effect Y on the scale of the selected effect measure (in logarithmic scale for relative effect measures). |
| mean_x | A real number that refers to the mean of the estimated treatment effect X on the scale of the selected effect measure (in logarithmic scale for relative effect measures). |
| sd_x | A positive real number that refers to the posterior standard deviation or the standard error of the estimated treatment effect X on the scale of the selected effect measure (in logarithmic scale for relative effect measures). |
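Since both treatment effects are assumed normal, the KLD has a closed form. The following Python sketch illustrates the three returned quantities under that assumption; the actual function is implemented in R, and the helper name `kld_normal` is hypothetical:

```python
import math

def kld_normal(mean_true, sd_true, mean_approx, sd_approx):
    # Closed-form KLD of approximating the 'true' normal
    # N(mean_true, sd_true^2) with N(mean_approx, sd_approx^2):
    # log(s_a / s_t) + (s_t^2 + (m_t - m_a)^2) / (2 * s_a^2) - 1/2
    return (math.log(sd_approx / sd_true)
            + (sd_true ** 2 + (mean_true - mean_approx) ** 2)
            / (2 * sd_approx ** 2)
            - 0.5)

def kld_measure(mean_y, sd_y, mean_x, sd_x):
    kld_x_true = kld_normal(mean_x, sd_x, mean_y, sd_y)  # X is the 'truth'
    kld_y_true = kld_normal(mean_y, sd_y, mean_x, sd_x)  # Y is the 'truth'
    kld_sym = (kld_x_true + kld_y_true) / 2              # symmetric average
    return {"kld_sym": kld_sym,
            "kld_x_true": kld_x_true,
            "kld_y_true": kld_y_true}
```

When the two distributions coincide, all three values are zero; otherwise each KLD value is strictly positive, and `kld_sym` is symmetric in its two arguments.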
Kullback S, Leibler RA. On information and sufficiency. Ann Math Stat 1951;22(1):79--86. doi: 10.1214/aoms/1177729694
kld_inconsistency, kld_inconsistency_user, robustness_index, robustness_index_user