VGAM (version 1.1-1)

hdeffsev: Hauck-Donner Effects: Severity Measures

Description

Computes the severity of the Hauck-Donner effect for each regression coefficient of a VGLM regression.

Usage

hdeffsev(x, y, dy, ddy, allofit = FALSE, tol0 = 0.1,
         severity.table = c("None", "Faint", "Weak", "Moderate",
                            "Strong", "Extreme", "Undetermined"))

Arguments

x, y

Numeric vectors; x holds the estimates, and y the corresponding Wald statistics.

dy, ddy

Numeric vectors; the first and second derivatives of the Wald statistics. They can be computed by hdeff. A toy illustration of these arguments is given at the end of this section.

allofit

Logical. If TRUE then a list containing additional quantities is returned. By default a vector is returned, with elements selected from the argument severity.table.

severity.table

Character vector with 7 values. The last value is used for initialization. Users should not usually assign anything to this argument or to tol0.

tol0

Numeric. Any estimate whose absolute value is less than tol0 is assigned the first value of the argument severity.table, i.e., "None". This handles a singularity at the origin: the estimates might be extremely close to 0.
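
As a toy illustration of the shapes of these arguments (made-up inputs only, not quantities from a fitted model; the curve is chosen so the "Wald statistic" rises and then falls):

# Toy inputs: a made-up Wald-statistic curve y = x^2 * exp(-x), with its
# first and second derivatives supplied by hand rather than by hdeff.
est  <- c(0.05, 0.5, 1.5, 3, 6)            # 'x': the estimates
wald <- est^2 * exp(-est)                  # 'y': toy Wald statistics
dy   <- (2 * est - est^2) * exp(-est)      # first derivative of the curve
ddy  <- (2 - 4 * est + est^2) * exp(-est)  # second derivative of the curve
hdeffsev(est, wald, dy, ddy)  # the first element is "None" since |0.05| < tol0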

Value

By default this function returns a labelled vector with elements selected from severity.table. If allofit = TRUE then a list is returned instead; Yee (2018) gives details about its other components. One such quantity, called zeta, is the normal line projected onto the x-axis, and its first derivative gives additional information about the position of the estimate along the curve.
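
Continuing the toy inputs shown under Arguments, one way to see what the allofit = TRUE list contains is simply to str() it; the only component name documented on this page is sev (it is used in the Examples below), the rest can be read off the output:

# Inspect the components returned when allofit = TRUE
str(hdeffsev(est, wald, dy, ddy, allofit = TRUE))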

Details

This function is rough-and-ready. It is possible to use the first two derivatives obtained from hdeff to categorize the severity of the Hauck-Donner effect (HDE). It is effectively assumed that, starting at the origin and moving to the right, the curve consists of a convex segment, followed by a concave segment, followed by another convex segment. Within the concave segment the derivative passes through 0, and beyond that point the HDE is truly manifest because the derivative is negative.

For "none" the estimate lies on the convex part of the curve near the origin, hence there is no HDE at all.

For "faint" and "weak" the estimate lies on the concave part of the curve but the Wald statistic is still increasing as estimate gets away from 0, hence it is only a mild HDE.

For "moderate", "strong" and "extreme" the Wald statistic is decreasing as the estimate gets away from 0, hence it really does exhibit the HDE. It is recommended that lrt.stat be used to compute LRT p-values, as they do not suffer from the HDE.

See Also

seglines, hdeff.

Examples

deg <- 4  # myfun is a function that approximates the HDE
myfun <- function(x, deriv = 0) switch(as.character(deriv),
  '0' = x^deg * exp(-x),
  '1' = (deg * x^(deg-1) - x^deg) * exp(-x),
  '2' = (deg * (deg-1) * x^(deg-2) - 2*deg * x^(deg-1) + x^deg) * exp(-x))

xgrid <- seq(0, 10, length = 101)  # grid of 'estimates'
ansm <- hdeffsev(xgrid, myfun(xgrid), myfun(xgrid, deriv = 1),
                 myfun(xgrid, deriv = 2), allofit = TRUE)
digg <- 4
# Tabulate the severity category alongside the curve, its first two
# derivatives, and (judging by its name) the first derivative of zeta (see Value).
cbind(severity = ansm$sev,
      fun      = round(myfun(xgrid), digg),
      deriv1   = round(myfun(xgrid, deriv = 1), digg),
      deriv2   = round(myfun(xgrid, deriv = 2), digg),
      zderiv1  = round(1 + (myfun(xgrid, deriv = 1))^2 +
                       myfun(xgrid, deriv = 2) * myfun(xgrid), digg))
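
Continuing the example above, a quick base-graphics plot (plain plot() and points(); seglines from the See Also section is not used here) shows where along the toy curve the severity category changes:

# Visualize the toy curve, colouring each point by its severity category
plot(xgrid, myfun(xgrid), type = "l",
     xlab = "estimate", ylab = "toy Wald statistic")
points(xgrid, myfun(xgrid), pch = 16, cex = 0.6,
       col = as.numeric(factor(ansm$sev)))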
