copBasic (version 2.1.5)

joeskewCOP: Joe's Nu-Skew and the copBasic Nu-Star of a Copula

Description

Compute the measure of permutation asymmetry of a copula, which can be thought of as bivariate skewness and which is named Nu-Skew \(\nu_\mathbf{C}\) for the copBasic package, according to Joe (2014, p. 66) by $$\nu_\mathbf{C} = 3\mathrm{E}[UV^2 - U^2V] = 6\int\!\!\int_{\mathcal{I}^2} (v-u)\mathbf{C}(u,v)\, \mathrm{d}u\mathrm{d}v\mbox{.}$$ This definition corresponds to type="nu" for the function, for which the multiplier \(6\) has been converted to \(96\) as explained in the Note.

Numerical results indicate \(\nu_\mathbf{W} \approx 0\) (W), \(\nu_\mathbf{\Pi} = 0\) (P), \(\nu_\mathbf{M} \approx 0\) (M), \(\nu_\mathbf{PL} \approx 0\) for all \(\Theta\) (PLcop), and \(\nu_\mathbf{GH} = 0\) (GHcop); copulas with mirror symmetry across the equal value line have \(\nu_\mathbf{C} = 0\).
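
To see this zero directly for a permutation-symmetric case, the double integral can be approximated on a grid without the package at all. The following is a minimal base-R sketch of Joe's 6-multiplier definition applied to the independence copula \(\mathbf{\Pi}(u,v) = uv\); the grid spacing d is arbitrary and only for illustration:

  d <- 0.001; u <- seq(d/2, 1 - d/2, by=d)        # midpoint grid on (0,1)
  G <- outer(u, u, function(u,v) (v - u) * u * v) # (v-u)*PI(u,v) on the grid
  6 * sum(G) * d^2                                # approximately zero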

Asymmetric copulas do exist. For example, consider an asymmetric Gumbel--Hougaard \(\mathbf{GH}\) copula with \(\Theta_p = (5,0.8,p)\):

  optimize(function(p) { nuskewCOP(cop=GHcop, para=c(5,0.8, p)) },
           c(0,0.99) )$minimum
  UV <- simCOP(n=10000, cop=GHcop, para=c(5,0.8, 0.2836485)) # inspect the graphics
  48*mean(UV$U*UV$V^2 - UV$U^2*UV$V) # -0.2847953 (not the 3rd parameter)

The minimization yields \(\nu_{\mathbf{GH}(5, 0.8, 0.2836485)} = -0.2796104\), which is close to the sample expectation computed above, where \(48 = 96/2\).
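
Making the rescaling explicit, the package convention with the \(96\) multiplier follows from Joe's identity \(3\mathrm{E}[UV^2 - U^2V] = 6\int\!\!\int_{\mathcal{I}^2} (v-u)\mathbf{C}(u,v)\,\mathrm{d}u\mathrm{d}v\) as $$\nu_\mathbf{C} = 96\int\!\!\int_{\mathcal{I}^2} (v-u)\mathbf{C}(u,v)\, \mathrm{d}u\mathrm{d}v = \frac{96}{6}\times 3\,\mathrm{E}[UV^2 - U^2V] = 48\,\mathrm{E}[UV^2 - U^2V]\mbox{,}$$ which is the form estimated by the 48*mean(...) line above.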

A complementary definition is supported, triggered by type="nustar", and is computed by $$\nu^\star_\mathbf{C} = 12\int\!\!\int_{\mathcal{I}^2} (v+u)\mathbf{C}(u,v)\, \mathrm{d}u\mathrm{d}v - 4\mbox{,}$$ which, for the copBasic package, is named Nu-Star \(\nu^\star_\mathbf{C}\). The multiplier \(12\) and the offset \(-4\) have been chosen so that numerical results indicate \(\nu^\star_\mathbf{W} = -1\) (W), \(\nu^\star_\mathbf{\Pi} = 0\) (P), and \(\nu^\star_\mathbf{M} = +1\) (M).
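
As a quick numerical check of the chosen constants, the \(\nu^\star\) integral can also be approximated on a grid with base R alone. This minimal sketch uses the comonotonicity copula \(\mathbf{M}(u,v) = \mathrm{min}(u,v)\), for which the result should be near \(+1\); the grid spacing d is again only for illustration:

  d <- 0.001; u <- seq(d/2, 1 - d/2, by=d)             # midpoint grid on (0,1)
  G <- outer(u, u, function(u,v) (v + u) * pmin(u, v)) # (v+u)*M(u,v) on the grid
  12 * sum(G) * d^2 - 4                                # approximately +1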

Lastly, the uvlmoms function provides a quantile-based measure of bivariate skewness based on the difference \(U - V\) that is also discussed by Joe (2014, p. 66).

Usage

joeskewCOP(cop=NULL, para=NULL, type=c("nu", "nustar", "nuskew"),
                               as.sample=FALSE, brute=FALSE, delta=0.002, ...)

nuskewCOP(cop=NULL, para=NULL, as.sample=FALSE, brute=FALSE, delta=0.002, ...)

nustarCOP(cop=NULL, para=NULL, as.sample=FALSE, brute=FALSE, delta=0.002, ...)

Arguments

cop

A copula function;

para

Vector of parameters or other data structure, if needed, to pass to the copula;

type

The type of metric to compute (nu and nuskew are synonymous for \(\nu_\mathbf{C}\) and nustar is for \(\nu^\star_\mathbf{C}\));

brute

Should brute force be used instead of two nested integrate() functions to perform the double integration;

delta

The \(\mathrm{d}u\) and \(\mathrm{d}v\) for the brute force integration using brute;

as.sample

A logical controlling whether an optional R data.frame in para is used to compute the sample \(\hat\nu\) or \(\hat\nu^\star\) (see Note). If set to -1, then the message concerning CPU effort will be suppressed; and

...

Additional arguments to pass.

Value

The value for \(\nu_\mathbf{C}\) or \(\nu^\star_\mathbf{C}\) is returned.

Details

The implementation of joeskewCOP for copBasic provides the second metric of asymmetry, but why? Consider the results that follow:

  joeskewCOP(cop=GHcop, para=c(5, 0.8,    0.2836485), type="nu")
     # -0.2796104
  joeskewCOP(cop=GHcop, para=c(5, 0.2836485,    0.8), type="nu")
     # +0.2796103
  joeskewCOP(cop=GHcop, para=c(5, 0.8,    0.2836485), type="nustar")
     #  0.3571276
  joeskewCOP(cop=GHcop, para=c(5, 0.2836485,    0.8), type="nustar")
     #  0.3571279
  tauCOP(    cop=GHcop, para=c(5, 0.2836485,    0.8))
     #  0.2443377

The demonstration shows, at least for the switchability (symmetry) of the 2nd and 3rd parameters (\(\pi_2\) and \(\pi_3\)) of the asymmetric \(\mathbf{GH}\) copula, that the first definition \(\nu\) is symmetric in magnitude but carries a sign change, whereas \(\nu^\star\) is stable in both magnitude and sign; the demonstration ends with Kendall Tau (tauCOP). Collectively, Kendall Tau (or the other symmetric measures of association, e.g. blomCOP, footCOP, giniCOP, hoefCOP, rhoCOP, wolfCOP), when combined with \(\nu\) and \(\nu^\star\), might provide a framework for parameter optimization of the asymmetric \(\mathbf{GH}\) copula (see below).

The asymmetric \(\mathbf{GH}_{(5, 0.2836485, 0.8)}\) is neither radially symmetric (isCOP.radsym) nor permutation symmetric (isCOP.permsym), but if \(\pi_2 = \pi_3\), then the resulting \(\mathbf{GH}\) copula is still not radially symmetric but is permutation symmetric:

  isCOP.radsym( cop=GHcop, para=c(5, 0.2836485, 0.8)) # FALSE
  isCOP.permsym(cop=GHcop, para=c(5, 0.2836485, 0.8)) # FALSE
  isCOP.radsym( cop=GHcop, para=c(5, 0.8,       0.8)) # FALSE
  isCOP.permsym(cop=GHcop, para=c(5, 0.8,       0.8)) # TRUE

The use of \(\nu_\mathbf{C}\) and \(\nu^\star_\mathbf{C}\) together with a measure of association for parameter optimization was suggested just above. Suppose we have \(\mathbf{GH}_{(5,0.5,0.7)}\) with Spearman Rho \(\rho = 0.4888\), \(\nu = 0.001475\), and \(\nu^\star = 0.04223\), and the asymmetric \(\mathbf{GH}\) copula is to be fit. Parameter estimation for the asymmetric \(\mathbf{GH}\) is accomplished by

  "fitGHcop" <- function(hats, assocfunc=rhoCOP, init=NA, eps=1E-4, ...) {
     H <- GHcop # shorthand for the copula
     "objfunc" <- function(par) {
        par[1]   <- ifelse(par[1] < 1, return(Inf), exp(par[1])) # edge check
        par[2:3] <-  pnorm(par[2:3]) # detransform
        hp <- c(assocfunc(H, par), nuskewCOP(H, par), nustarCOP(H, par))
        return(sum((hats-hp)^2))
     }
     # Theta=1 and Pi2 = Pi3 = 1/2 # as default initial estimates
     if(all(is.na(init))) init <- c(1, rep(1/2, times=2))
     opt  <- optim(init, objfunc, ...); par <- opt$par
     para <- c( exp(par[1]), pnorm(par[2:3]) )
     names(para) <- c("Theta", "Pi2", "Pi3")
     fit <- c(assocfunc(H, para), nuskewCOP(H, para), nustarCOP(H, para))
     txt <- c("AssocMeasure", "NuSkew", "NuStar")
     names(fit) <- txt; names(hats) <- txt
     if(opt$value > eps) warning("inspect the fit")
     return(list(para=para, fit=fit, given=hats, optim=opt))
  }
  father <- c(5,.5,.7)
  densityCOPplot(cop=GHcop, para=father, contour.col=8)
  fRho  <- rhoCOP(   cop=GHcop, father)
  fNu   <- nuskewCOP(cop=GHcop, father)
  fStar <- nustarCOP(cop=GHcop, father)

  child <- fitGHcop(c(fRho, fNu, fStar))$para
  densityCOPplot(cop=GHcop, para=child, ploton=FALSE)
  cRho  <- rhoCOP(   cop=GHcop, child)
  cNu   <- nuskewCOP(cop=GHcop, child)
  cStar <- nustarCOP(cop=GHcop, child)
  message("Father stats: ", paste(fRho,   fNu, fStar, sep=", "))
  message("Child stats: ",  paste(cRho,   cNu, cStar, sep=", "))
  message("Father para: ",  paste(father, collapse=", "))
  message("Child para: ",   paste(child,  collapse=", "))

The initial parameter estimate has the value \(\Theta = 1\), which is independence for the one-parameter \(\mathbf{GH}\). The two other parameters are set as \(\pi_2 = \pi_3 = 1/2\) to be at the mid-point of their domain. The transformations using the log() \(\leftrightarrow\) exp() and qnorm() \(\leftrightarrow\) pnorm() functions in R keep the optimization within the viable parameter domain. The results produce a fitted copula of \(\mathbf{GH}_{(4.907, 0.5006, 0.7014)}\). This fit aligns well with the parent, and the three statistics are essentially matched during the fitting.
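
The two transform pairs can be sketched in isolation with base R; the parameter values below are simply the fitted ones quoted above and serve only as an illustration of the round trip:

  theta <- 4.907; pi23 <- c(0.5006, 0.7014)
  par   <- c(log(theta), qnorm(pi23))  # unconstrained scale handed to optim()
  c(exp(par[1]), pnorm(par[2:3]))      # back-transform recovers the parameters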

The \(\nu^\star_\mathbf{C}\) can be similar to rhoCOP, but differences do exist. When \(\nu_\mathbf{C} = 0\) (mirror symmetry across the equal value line), the \(\nu^\star_\mathbf{C}\) is nearly equal to Spearman Rho for some copulas. Let us test further:

  p <- 10^seq(0,2,by=.01)
  s <- sapply(p, function(t) nustarCOP(cop=GHcop, para=c(t)))
  r <- sapply(p, function(t)    rhoCOP(cop=GHcop, para=c(t)))
  plot(p,s, log="x", type="l", col=3, lwd=3); lines(p,r)

Now let us add some asymmetry:

  s <- sapply(p, function(t) nustarCOP(cop=GHcop, para=c(t, 0.25, 0.75)))
  r <- sapply(p, function(t)    rhoCOP(cop=GHcop, para=c(t, 0.25, 0.75)))
  plot(p,s, log="x", type="l", col=3, lwd=3); lines(p,r)

Now let us choose a different copula (the Clayton):

  s <- sapply(p, function(t) nustarCOP(cop=CLcop, para=c(t)))
  r <- sapply(p, function(t)    rhoCOP(cop=CLcop, para=c(t)))
  plot(p,s, log="x", type="l", col=3, lwd=3); lines(p,r)

References

Joe, H., 2014, Dependence modeling with copulas: Boca Raton, CRC Press, 462 p.

See Also

uvskew, blomCOP, footCOP, giniCOP, hoefCOP, rhoCOP, tauCOP, wolfCOP

Examples

# NOT RUN {
nuskewCOP(cop=GHcop,para=c(1.43,1/2,1))*(6/96) # 0.005886 (Joe, 2014, p. 184; 0.0059)

# }
# NOT RUN {
joeskewCOP(            cop=GHcop, para=c(8, .7, .5)) # -0.1523491
joeskewCOP(            cop=GHcop, para=c(8, .5, .7)) # +0.1523472
# UV <- simCOP(n=1000, cop=GHcop, para=c(8, .7, .5)) # see the switch in
# UV <- simCOP(n=1000, cop=GHcop, para=c(8, .5, .7)) # curvature
# }
# NOT RUN {
para=c(19,0.3,0.8); set.seed(341)
nuskew <-  nuskewCOP( cop=GHcop, para=para) # 0.3057744
UV <- simCOP(n=10000, cop=GHcop, para=para) #   a large simulation
mean((UV$U - UV$V)^3)/(6/96)                # 0.3127398

# Two other definitions of skewness follow and are not numerically the same.
uvskew(u=UV$U, v=UV$V, umv=TRUE)  # 0.3738987  (see documentation uvskew)
uvskew(u=UV$U, v=UV$V, umv=FALSE) # 0.3592739  ( or documentation uvlmoms)
# Yet another definition of skew, which requires large sample approximation
# using the L-comoments (3rd L-comoment is L-coskew).
lmomco::lcomoms2(UV)$T3 # L-coskew of the simulated values [1,2] and [2,1]
#             [,1]        [,2]
#[1,]  0.007398438  0.17076600
#[2,] -0.061060260 -0.00006613
# See the asymmetry in the two L-coskew values and consider this in light of
# the graphic produced by the simCOP() called for n=10,000. The T3[1,1] is
# the sampled L-skew (univariate) of the U margin and T3[2,2] is the same
# but for the V margin. Because the margins are (ideally) uniform, these
# must, for a suitably large sample, be zero because the L-skew of the
# uniform distribution is by definition zero.
#
# Now let us check the sample estimator for a sample of size n=300; the
# t-test will (should) result in acceptance of the null hypothesis.
S <- replicate(60, nuskewCOP(para=simCOP(n=300, cop=GHcop, para=para,
                                         graphics=FALSE), as.sample=TRUE))
t.test(S, mu=nuskew)
#t = 0.004633, df = 59, p-value = 0.9963
#alternative hypothesis: true mean is not equal to 0.3057744
#95 percent confidence interval:
# 0.2854282 0.3262150
#sample estimates:
#mean of x
#0.3058216 
# }
# NOT RUN {
# Let us run a large ensemble of copula properties that use the whole copula
# (not tail properties). We composite a Plackett with a Gumbel-Hougaard for
# which the overall association (correlation) sign is negative, but among
# these statistics, with nuskew and nustar at the bottom, there are various
# quantities that can be extracted. These could be used for fitting.
set.seed(873)
para <- list(cop1=PLcop, cop2=GHcop, alpha=0.6, beta=0.9,
             para1=.005, para2=c(8.3,0.25,0.7))
UV <- simCOP(1000, cop=composite2COP, para=para) # just to show
  blomCOP(composite2COP, para)            # -0.4078657
  footCOP(composite2COP, para)            # -0.2854227
  hoefCOP(composite2COP, para)            # +0.5713775
  lcomCOP(composite2COP, para)$lcomUV[3]  # +0.1816084
  lcomCOP(composite2COP, para)$lcomVU[3]  # +0.1279844
   rhoCOP(composite2COP, para)            # -0.5688417
rhobevCOP(composite2COP, para)            # -0.2005210
   tauCOP(composite2COP, para)            # -0.4514693
  wolfCOP(composite2COP, para)            # +0.5691933
nustarCOP(composite2COP, para)            # -0.5172434
nuskewCOP(composite2COP, para)            # +0.0714987 
# }
