# bcPower

##### Box-Cox, Box-Cox with Negatives Allowed, Yeo-Johnson and Basic Power Transformations

Transform the elements of a vector or the columns of a matrix using the Box-Cox, Box-Cox with negatives allowed, Yeo-Johnson, or simple power transformations.

Keywords
regression
##### Usage
bcPower(U, lambda, jacobian.adjusted=FALSE, gamma=NULL)

bcnPower(U, lambda, jacobian.adjusted=FALSE, gamma)

yjPower(U, lambda, jacobian.adjusted=FALSE)

basicPower(U, lambda, gamma=NULL)
##### Arguments
U

A vector, matrix or data.frame of values to be transformed

lambda

Power transformation parameter with one element for each column of U, usually in the range from $-2$ to $2$.

jacobian.adjusted

If TRUE, the transformation is normalized to have a Jacobian equal to one. The default, FALSE, is almost always appropriate.

gamma

For bcPower or basicPower, the transformation is of U + gamma, where gamma is a positive number called a start that must be large enough that U + gamma is strictly positive. For bcnPower, the Box-Cox power with negatives allowed, see the Details below.

##### Details

The Box-Cox family of scaled power transformations equals $((U + \gamma)^{\lambda}-1)/\lambda$ for $\lambda \neq 0$, and $\log(U+\gamma)$ if $\lambda = 0$. If $\gamma$ is not specified, it is set equal to zero. U + gamma must be strictly positive for this family to be used.
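As an illustration, the scaled power formula above can be sketched directly in R. This is a minimal, illustrative implementation, not the car package's bcPower; the function name is ours.

```r
# Minimal sketch of the scaled Box-Cox family: ((u + gamma)^lambda - 1)/lambda,
# with the log limit at lambda = 0. Illustrative only; not car::bcPower.
box_cox <- function(u, lambda, gamma = 0) {
  z <- u + gamma
  stopifnot(all(z > 0, na.rm = TRUE))   # the family requires U + gamma > 0
  if (abs(lambda) < 1e-10) log(z) else (z^lambda - 1) / lambda
}

box_cox(1:4, 0)   # equals log(1:4)
box_cox(1:4, 2)   # equals ((1:4)^2 - 1)/2
```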

The Box-Cox family with negatives allowed was proposed by Hawkins and Weisberg (2017). It is the Box-Cox power transformation of $z = .5 (y + (y^2 + \gamma^2)^{1/2})$, where $\gamma$ is strictly positive if $y$ includes negative values and non-negative otherwise. The value of $z$ is always positive. The bcnPower transformations behave very similarly to the bcPower transformations, and introduce much less bias than is introduced by setting the parameter $\gamma$ to be non-zero in the Box-Cox family.
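A quick numerical check of the intermediate quantity $z$ defined above (the function name here is illustrative, not part of the package API):

```r
# z = 0.5 * (y + sqrt(y^2 + gamma^2)) from Hawkins and Weisberg (2017);
# bcn_z is an illustrative name, not a car function.
bcn_z <- function(y, gamma) 0.5 * (y + sqrt(y^2 + gamma^2))

y <- c(-5, -1, 0, 2)
z <- bcn_z(y, gamma = 1)
all(z > 0)   # TRUE: z is strictly positive even where y is negative
```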

The Yeo-Johnson transformations, computed by yjPower, are the Box-Cox transformations of $U+1$ for nonnegative values of $U$, and of $|U|+1$ with parameter $2-\lambda$ for $U$ negative.
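The piecewise rule can be sketched as follows. This is an illustrative implementation, not the car package's yjPower; note that the negative branch is negated, following Yeo and Johnson (2000).

```r
# Illustrative Yeo-Johnson sketch: Box-Cox of u + 1 for u >= 0, and the
# negative of Box-Cox of |u| + 1 with parameter 2 - lambda for u < 0.
yeo_johnson <- function(u, lambda) {
  bc <- function(z, l) if (abs(l) < 1e-10) log(z) else (z^l - 1) / l
  out <- u
  pos <- !is.na(u) & u >= 0
  neg <- !is.na(u) & u < 0
  out[pos] <- bc(u[pos] + 1, lambda)
  out[neg] <- -bc(abs(u[neg]) + 1, 2 - lambda)   # negated, per Yeo-Johnson
  out
}

yeo_johnson(c(-2, 0, 3), 1)   # lambda = 1 is the identity: -2 0 3
```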

The basic power transformation returns $U^{\lambda}$ if $\lambda$ is not zero, and $\log(U)$ otherwise, for $U$ strictly positive.
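In R this is a one-liner (again an illustrative sketch, not car::basicPower):

```r
# Basic power: u^lambda, or log(u) at lambda = 0; u must be strictly positive.
basic_power <- function(u, lambda) if (lambda == 0) log(u) else u^lambda

basic_power(c(1, 4, 9), 0.5)   # square roots: 1 2 3
```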

If jacobian.adjusted is TRUE, then the scaled transformations are divided by the Jacobian, which is a function of the geometric mean of $U$ for bcnPower and yjPower and of $U + \gamma$ for bcPower. With this adjustment, the Jacobian of the transformation is always equal to 1.
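The effect of the adjustment can be verified numerically: dividing the scaled Box-Cox transform by the geometric mean raised to the power $\lambda - 1$ makes the product of the pointwise derivatives (the Jacobian) equal to 1. A small sketch with $\gamma = 0$, illustrative rather than the package's code:

```r
u <- 1:10
lambda <- 0.5
gm <- exp(mean(log(u)))                           # geometric mean of u
adjusted <- ((u^lambda - 1) / lambda) / gm^(lambda - 1)

# Derivative of the adjusted transform at each point:
deriv <- u^(lambda - 1) / gm^(lambda - 1)
prod(deriv)   # 1 (up to rounding): the Jacobian equals one
```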

Missing values are permitted, and the transformations return NA wherever U is equal to NA.

##### Value

Returns a vector or matrix of transformed values.

##### References

Fox, J. and Weisberg, S. (2011) An R Companion to Applied Regression, Second Edition, Sage.

Hawkins, D. and Weisberg, S. (2017) Combining the Box-Cox Power and Generalized Log Transformations to Accommodate Negative Responses in Linear and Mixed-Effects Linear Models, submitted for publication.

Weisberg, S. (2014) Applied Linear Regression, Fourth Edition, Wiley, Chapter 7.

Yeo, In-Kwon and Johnson, Richard (2000) A new family of power transformations to improve normality or symmetry. Biometrika, 87, 954-959.

##### See Also

powerTransform, testTransform

• bcPower
• bcnPower
• yjPower
• basicPower
##### Examples
U <- c(NA, (-3:3))
# bcPower(U, 0)  # would produce an error, as U has negative values
bcPower(U, 0, gamma=4)