
corrfuns (version 1.2)

EL and EEL test for a correlation coefficient

Description

EL and EEL test for a correlation coefficient.

Usage

el.cor.test(y, x, rho, tol = 1e-07)
eel.cor.test(y, x, rho, tol = 1e-07)

Value

A list including:

iters

The number of iterations required by the Newton-Raphson algorithm. If no convergence occurred, this is NULL.

info

A vector with three values: the value of \(\lambda\), the test statistic and its associated asymptotic p-value. If no convergence occurred, the value of \(\lambda\) is NA, the value of the test statistic is \(10^5\) and the p-value is 0. No convergence can be interpreted as rejection of the hypothesis test.

p

The probabilities of the EL or of the EEL. If no convergence occurred, this is NULL.

Arguments

y

A numerical vector.

x

A numerical vector.

rho

The hypothesized value of the true Pearson correlation coefficient.

tol

The tolerance value at which to terminate the Newton-Raphson algorithm.

Author

Michail Tsagris

R implementation and documentation: Michail Tsagris mtsagris@uoc.gr.

Details

The empirical likelihood (EL) or the exponential empirical likelihood (EEL) test is performed for the Pearson correlation coefficient. First, we standardise the data so that the sample correlation equals the inner product between the two variables, \(\hat{r}=\sum_{i=1}^nx_iy_i\), where \(n\) is the sample size.
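The sketch below illustrates this standardisation; it is not the package's internal code and the variable names are only for illustration. Each variable is centred and rescaled to unit sum of squares, after which the inner product coincides with Pearson's correlation.

y <- iris[, 1]  ;  x <- iris[, 2]
ys <- (y - mean(y)) / sqrt( sum( (y - mean(y))^2 ) )  ## centre and rescale y
xs <- (x - mean(x)) / sqrt( sum( (x - mean(x))^2 ) )  ## centre and rescale x
sum(xs * ys)   ## inner product, equal to cor(x, y)
cor(x, y)      ## ordinary Pearson correlation for comparison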

The EL works by maximising the quantity \(\sum_{i=1}^n\log{(nw_i)}\) subject to the constraints \(\sum_{i=1}^nw_i(x_iy_i-\rho)=0\) and \(\sum_{i=1}^nw_i=1\), where \(\rho\) is the hypothesised correlation coefficient under \(H_0\). After some algebra the form of the weights \(w_i\) becomes $$ w_i=\frac{1}{n}\frac{1}{1+\lambda(x_iy_i-\rho)}, $$ where \(\lambda\) is the Lagrange multiplier of the first (zero sum) constraint. Thus, the zero sum constraint becomes \(\sum_{i=1}^n\frac{x_iy_i-\rho}{1 + \lambda(x_iy_i-\rho)}=0\) and this equation is solved with respect to \(\lambda\) via the Newton-Raphson algorithm. The derivative of this function is \(-\sum_{i=1}^n\frac{(x_iy_i-\rho)^2}{\left[1 + \lambda(x_iy_i-\rho)\right]^2}\).
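A minimal R sketch of this Newton-Raphson iteration for the EL case follows. It is not the package's internal code; the function name el.lambda and the argument maxit are illustrative assumptions.

el.lambda <- function(z, rho, tol = 1e-07, maxit = 100) {
  ## z = x * y after the standardisation described above
  lambda <- 0
  for (i in 1:maxit) {
    u <- z - rho
    f <- sum( u / (1 + lambda * u) )           ## zero sum constraint
    der <- - sum( u^2 / (1 + lambda * u)^2 )   ## its derivative
    lambda.new <- lambda - f / der             ## Newton-Raphson step
    if ( abs(lambda.new - lambda) < tol )  return( list(lambda = lambda.new, iters = i) )
    lambda <- lambda.new
  }
  list(lambda = NA, iters = NULL)              ## no convergence
}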

The EEL works by minimising the quantity \(\sum_{i=1}^nw_i\log{(nw_i)}\) subject to the same constraints as before, \(\sum_{i=1}^nw_i(x_iy_i-\rho)=0\), or equivalently \(\sum_{i=1}^nw_ix_iy_i=\rho\), and \(\sum_{i=1}^nw_i=1\). After some algebra the form of the weights \(w_i\) becomes $$ w_i=\frac{e^{\lambda x_iy_i}}{\sum_{j=1}^ne^{\lambda x_jy_j}}, $$ where, again, \(\lambda\) is the Lagrange multiplier of the first (zero sum) constraint. Thus, the zero sum constraint becomes \(\frac{\sum_{i=1}^nx_iy_ie^{\lambda x_iy_i}}{\sum_{j=1}^ne^{\lambda x_jy_j}}-\rho=0\) and this equation is solved with respect to \(\lambda\) via the Newton-Raphson algorithm. The derivative of this function is $$ \frac{\sum_{i=1}^n(x_iy_i)^2e^{\lambda x_iy_i}\,\sum_{i=1}^ne^{\lambda x_iy_i} - \left(\sum_{i=1}^nx_iy_ie^{\lambda x_iy_i}\right)^2}{\left(\sum_{j=1}^ne^{\lambda x_jy_j}\right)^2}. $$
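Analogously, a minimal R sketch of the EEL iteration is given below; again it is not the package's internal code, and eel.lambda and maxit are illustrative assumptions.

eel.lambda <- function(z, rho, tol = 1e-07, maxit = 100) {
  ## z = x * y after the standardisation described above
  lambda <- 0
  for (i in 1:maxit) {
    w <- exp(lambda * z)                                         ## unnormalised weights
    f <- sum(z * w) / sum(w) - rho                               ## zero sum constraint
    der <- ( sum(z^2 * w) * sum(w) - sum(z * w)^2 ) / sum(w)^2   ## its derivative
    lambda.new <- lambda - f / der                               ## Newton-Raphson step
    if ( abs(lambda.new - lambda) < tol )  return( list(lambda = lambda.new, iters = i) )
    lambda <- lambda.new
  }
  list(lambda = NA, iters = NULL)                                ## no convergence
}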

References

Efron B. (1981) Nonparametric standard errors and confidence intervals. Canadian Journal of Statistics, 9(2): 139--158.

Owen A. B. (2001). Empirical likelihood. Chapman and Hall/CRC Press.

See Also

perm.elcortest, correl, permcor

Examples

el.cor.test( iris[, 1], iris[, 2], 0 )$info
eel.cor.test( iris[, 1], iris[, 2], 0 )$info
