copBasic (version 2.1.5)

blomCOP: The Blomqvist Beta of a Copula

Description

Compute the Blomqvist Beta \(\beta_\mathbf{C}\) of a copula (Nelsen, 2006, p. 182), which is defined at the middle or center of \(\mathcal{I}^2\) as

$$\beta_\mathbf{C} = 4\times\mathbf{C}\biggl(\frac{1}{2},\frac{1}{2}\biggr) - 1\mbox{,}$$

where \(u = v = 1/2\), which shows that \(\beta_\mathbf{C}\) is based on the median joint probability. The Blomqvist Beta is also called the medial correlation coefficient. Nelsen also reports that “although the Blomqvist Beta depends on the copula only through its value at the center of \(\mathcal{I}^2\), [\(\beta_\mathbf{C}\)] nevertheless often provides an accurate approximation to both Spearman Rho rhoCOP and Kendall Tau tauCOP.” Kendall Tau \(\tau_\mathbf{C}\), Spearman Rho \(\rho_\mathbf{C}\), and Gini Gamma \(\gamma_\mathbf{C}\) in relation to \(\beta_\mathbf{C}\) satisfy the following inequalities (Nelsen, 2006, exer. 5.17, p. 185):

$$\frac{1}{4}(1 + \beta_\mathbf{C})^2 - 1 \le \tau_\mathbf{C} \le 1 - \frac{1}{4}(1 - \beta_\mathbf{C})^2\mbox{,}$$

$$\frac{3}{16}(1 + \beta_\mathbf{C})^3 - 1 \le \rho_\mathbf{C} \le 1 - \frac{3}{16}(1 - \beta_\mathbf{C})^3\mbox{, and}$$

$$\frac{3}{8}(1 + \beta_\mathbf{C})^2 - 1 \le \gamma_\mathbf{C} \le 1 - \frac{3}{8}(1 - \beta_\mathbf{C})^2\mbox{.}$$
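
For orientation, a minimal sketch of the definition (assuming the copBasic package is attached; COP() is the package's generic copula evaluator) computes \(\beta_\mathbf{C}\) directly from \(4\mathbf{C}(1/2,1/2) - 1\):

library(copBasic)
4*COP(0.5, 0.5, cop=PSP) - 1   # Beta from its definition at the center of I^2 : 1/3
blomCOP(cop=PSP)               # same value from the function documented here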

A curious aside (Joe, 2014, p. 164) about the Gaussian copula is that Blomqvist Beta (blomCOP) is equal to Kendall Tau (tauCOP): \(\beta_\mathbf{C} = \tau_\mathbf{C}\) (see Note in med.regressCOP for a demonstration).
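
A brief numeric check of that equality (a sketch not taken from the package documentation; it assumes the mvtnorm package for the bivariate normal CDF) is

rho    <- 0.7                                        # Gaussian copula (correlation) parameter
sig    <- matrix(c(1, rho, rho, 1), nrow=2)
C.half <- mvtnorm::pmvnorm(upper=c(0,0), corr=sig)   # C(1/2,1/2) = P(Z1 <= 0, Z2 <= 0)
4*as.numeric(C.half) - 1                             # Blomqvist Beta by definition, about 0.4936
(2/pi)*asin(rho)                                     # Kendall Tau of the Gaussian copula, also about 0.4936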

Usage

blomCOP(cop=NULL, para=NULL, as.sample=FALSE, ...)

Arguments

cop

A copula function;

para

Vector of parameters or other data structure, if needed, to pass to the copula;

as.sample

A logical controlling whether an optional R data.frame in para is used to compute the sample estimate \(\hat\beta_\mathbf{C}\) (see Note; a brief sketch also follows these argument descriptions); and

...

Additional arguments to pass to the copula.
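
As a brief sketch of the as.sample interface (assuming copBasic is attached; rCOP() is the package's copula random-variate simulator, used again in the Examples below):

UV <- rCOP(n=500, cop=PSP)        # simulate 500 (U,V) pairs from the PSP copula
blomCOP(para=UV, as.sample=TRUE)  # sample estimate hatBeta, near the true value 1/3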

Value

The value for \(\beta_\mathbf{C}\) is returned.

References

Joe, H., 2014, Dependence modeling with copulas: Boca Raton, CRC Press, 462 p.

Nelsen, R.B., 2006, An introduction to copulas: New York, Springer, 269 p.

See Also

footCOP, giniCOP, hoefCOP, rhoCOP, tauCOP, wolfCOP, joeskewCOP, uvlmoms

Examples

blomCOP(cop=PSP) # 1/3 precisely
# Nelsen (2006, exer. 5.17, p. 185) : All if(...) are TRUE
B <- blomCOP(cop=N4212cop, para=2.2); Bp1 <- 1 + B; Bm1 <- 1 - B
G <- giniCOP(cop=N4212cop, para=2.2); a <- 1/4; b <- 3/16; c <- 3/8
R <-  rhoCOP(cop=N4212cop, para=2.2)
K <-  tauCOP(cop=N4212cop, para=2.2, brute=TRUE) # numerical issues without brute
if( a*Bp1^2 - 1 <= K & K <= 1 - a*Bm1^2 ) print("TRUE") #
if( b*Bp1^3 - 1 <= R & R <= 1 - b*Bm1^3 ) print("TRUE") #
if( c*Bp1^2 - 1 <= G & G <= 1 - c*Bm1^2 ) print("TRUE") #
# A demonstration of a special feature of blomCOP for sample data.
# Joe (2014, p. 60; table 60) has 0.749 for GHcop(tau=0.5); n*var(hatB) as n-->infinity
theta <- GHcop(tau=0.5)$para; B <- blomCOP(cop=GHcop, para=theta); n <- 1000
H <- sapply(1:1000, function(i) { # Let us test that with pretty large sample size:
	                blomCOP(para=rCOP(n=n, cop=GHcop, para=theta), as.sample=TRUE) })
print(n*var(B-H)) # For 1,000 simulations of size n : 0.747, which matches Joe's result 
# Joe (2014, p. 57) says that sqrt(n)(B-HatBeta) is Norm(0, 1 - B^2)
n <- 10000; B <- blomCOP(cop=PSP) # Beta = 1/3
H <- sapply(1:100, function(i) { message(i,"-", appendLF=FALSE)
	               blomCOP(para=rCOP(n=n, cop=PSP), as.sample=TRUE) })
lmomco::parnor(lmomco::lmoms(sqrt(n)*(H-B))) # mu = 0.042; sigma = 0.973
# Joe (2014) : sqrt(1-B^2) == standard deviation (sigma) : sqrt(1-(1/3)^2) approx 0.943