lmridge (version 1.2)

kest.lmridge: Computation of Ridge Biasing Parameter \(K\)

Description

The kest function computes several ridge biasing parameters proposed in the literature by various researchers.

Usage

kest(object, ...)
# S3 method for lmridge
kest(object, ...)
# S3 method for klmridge
print(x, ...)

Arguments

object

An object of class "lmridge" for the kest method.

x

An object of class "klmridge" for the print method.

...

Not presently used in this implementation.

Value

The function returns a list of the following biasing parameters, each computed by a method proposed in the literature.

mHKB

By Thisted (1976), \(\frac{(p-2)\hat{\sigma}^2}{\sum\hat{\beta}_j^2}\)

LW

As in lm.ridge of the MASS package, \(\frac{(p-2)\hat{\sigma}^2 n}{\sum\hat{y}_i^2}\)

LW76

By Lawless and Wang (1976), \(\frac{p\hat{\sigma}^2}{\sum\lambda_j\hat{\alpha}_j^2}\)

CV

Value of cross-validation (CV) for each biasing parameter \(K\), \(CV_K=\frac{1}{n}\sum_{i=1}^n (y_i-x_i'\hat{\beta}_K)^2\).

kCV

Value of the biasing parameter at which CV is minimized.

HKB

By Hoerl and Kennard (1970), \(\frac{p\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}\)

kibAM

By Kibria (2003), \(\frac{1}{p}\sum\frac{\hat{\sigma}^2}{\hat{\beta}_j^2}\)

GCV

Value of generalized cross-validation (GCV) for each biasing parameter \(K\), \(GCV_K=\frac{\sum_{i=1}^n(y_i-x_i'\hat{\beta}_K)^2}{[n-(1+trace(H_{R,K}))]^2}\).

kGCV

Value of the biasing parameter at which GCV is minimized.

DSK

By Dwivedi and Srivastava (1978), \(\frac{\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}\)

kibGM

By Kibria (2003), \(\frac{\hat{\sigma}^2}{(\prod\hat{\alpha}_j^2)^{1/p}}\)

kibMEd

By Kibria (2003), \(median(\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2})\)

KM2

By Muniz and Kibria (2009), \(max[\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}^2_j}}}]\)

KM3

By Muniz and Kibria (2009), \(max[\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}]\)

KM4

By Muniz and Kibria (2009), \([\prod\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}}]^\frac{1}{p}\)

KM5

By Muniz and Kibria (2009), \([\prod \sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}_j^2}}]^{\frac{1}{p}}\)

KM6

By Muniz and Kibria (2009), \(Median[\frac{1}{\sqrt{\frac{\hat{\sigma}^2}{\hat{\alpha}^2_j}}}]\)

KM8

By Muniz et al. (2012), \(max(\frac{1}{\sqrt{\frac{\lambda_{max} \hat{\sigma}^2} {(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}^2_j}}})\)

KM9

By Muniz et al. (2012), \(max[\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}^2_j}}]\)

KM10

By Muniz et al. (2012), \([\prod(\frac{1}{\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}^2_j}}})]^{\frac{1}{p}}\)

KM11

By Muniz et al. (2012), \([\prod(\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}^2_j}})]^{\frac{1}{p}}\)

KM12

By Muniz et al. (2012), \(Median[\frac{1}{\sqrt{\frac{\lambda_{max}\hat{\sigma}^2}{(n-p)\hat{\sigma}^2+\lambda_{max}\hat{\alpha}^2_j}}}]\)

KD

By Dorugade and Kashid (2010), \(max(0, \frac{p\hat{\sigma}^2}{\hat{\alpha}'\hat{\alpha}}-\frac{1}{n(VIF_j)_{max}})\)

KAD4

By Dorugade (2014), \(HM[\frac{2p}{\lambda_{max}} \sum(\frac{\hat{\sigma}^2}{\hat{\alpha}^2_j})]\)

alphahat

The OLS estimator in canonical form, i.e., \(\hat{\alpha}=(X^{*'}X^{*})^{-1}X^{*'}y\), where \(X^{*}=XP\) and \(P\) is the matrix of eigenvectors of \(X'X\).
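The canonical-form estimator above can be verified numerically. The following NumPy sketch (an illustration on hypothetical data, not the package's R code) computes \(\hat{\alpha}\) from the eigenvectors of \(X'X\) and checks that rotating back with \(P\) recovers the ordinary OLS estimate:

```python
import numpy as np

# Hypothetical small design matrix and response (not the Hald data).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=20)

# Eigendecomposition of X'X; columns of P are the eigenvectors.
eigval, P = np.linalg.eigh(X.T @ X)

# Canonical form: X* = X P, alpha_hat = (X*' X*)^{-1} X*' y.
Xstar = X @ P
alpha_hat = np.linalg.solve(Xstar.T @ Xstar, Xstar.T @ y)

# Rotating back recovers the ordinary OLS estimate: beta_hat = P alpha_hat.
beta_hat_ols = np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(P @ alpha_hat, beta_hat_ols)
```

Because \(X^{*'}X^{*}\) is diagonal (its entries are the eigenvalues \(\lambda_j\)), the canonical form makes the per-coefficient formulas above (e.g. \(\hat{\sigma}^2/\hat{\alpha}_j^2\)) cheap to evaluate.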


Details

The kest function computes the biasing parameter for ordinary linear ridge regression by each of the methods listed above, all of which have been proposed in the literature by various authors; see the References section.
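As a concrete example of one such method, the HKB parameter \(\frac{p\hat{\sigma}^2}{\hat{\beta}'\hat{\beta}}\) can be computed directly from an OLS fit. The sketch below uses NumPy on hypothetical data (the variable names are illustrative and not part of the package):

```python
import numpy as np

# Fit OLS on hypothetical data (illustration only, not the package code).
rng = np.random.default_rng(1)
n, p = 30, 4
X = rng.normal(size=(n, p))
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.5, size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)          # error-variance estimate

# Hoerl and Kennard (1970) biasing parameter: k = p * sigma^2 / (beta' beta)
k_HKB = p * sigma2_hat / (beta_hat @ beta_hat)
assert k_HKB > 0
```

The other closed-form estimators in the Value section follow the same pattern, substituting the canonical coefficients \(\hat{\alpha}_j\) and eigenvalues \(\lambda_j\) where the formulas call for them.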

References

Dorugade, A. and Kashid, D. (2010). Alternative Method for Choosing Ridge Parameter for Regression. Applied Mathematical Sciences, 4(9), 447-456.

Dorugade, A. (2014). New Ridge Parameters for Ridge Regression. Journal of the Association of Arab Universities for Basic and Applied Sciences, 15, 94-99.

Hoerl, A. E., Kennard, R. W., and Baldwin, K. F. (1975). Ridge Regression: Some Simulation. Communications in Statistics, 4, 105-123.

Hoerl, A. E. and Kennard, R. W. (1970). Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics, 12, 55-67.

Imdad, M. U. (2017). Addressing Linear Regression Models with Correlated Regressors: Some Package Development in R. Doctoral Thesis, Department of Statistics, Bahauddin Zakariya University, Multan, Pakistan.

Kibria, B. (2003). Performance of Some New Ridge Regression Estimators. Communications in Statistics - Simulation and Computation, 32(2), 419-435.

Lawless, J., and Wang, P. (1976). A Simulation Study of Ridge and Other Regression Estimators. Communications in Statistics - Theory and Methods, 5(4), 307-323.

Muniz, G., and Kibria, B. (2009). On Some Ridge Regression Estimators: An Empirical Comparisons. Communications in Statistics - Simulation and Computation, 38(3), 621-630.

Muniz, G., Kibria, B., Mansson, K., and Shukur, G. (2012). On Developing Ridge Regression Parameters: A Graphical Investigation. SORT - Statistics and Operations Research Transactions, 36(2), 115-138.

Thisted, R. A. (1976). Ridge Regression, Minimax Estimation and Empirical Bayes Methods. Technical Report 28, Division of Biostatistics, Stanford University, California.

Venables, W. N. and Ripley, B. D. (2002). Modern Applied Statistics with S. Springer, New York, 4th edition. ISBN 0-387-95457-0.

See Also

The ridge model fitting function lmridge; the ridge variance-covariance matrix vcov.

Examples

library(lmridge)

mod <- lmridge(y ~ ., data = as.data.frame(Hald), K = seq(0, 0.2, 0.001))
kest(mod)

## GCV values
kest(mod)$GCV

## value of K at which GCV is minimum
kest(mod)$kGCV

## Hoerl and Kennard (1970)
kest(mod)$HKB
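The GCV selection that `kest(mod)$kGCV` performs can be sketched outside R as well. The following NumPy illustration (hypothetical data and names, not the package implementation) evaluates \(GCV_K\) with the ridge hat matrix \(H_{R,K}=X(X'X+KI)^{-1}X'\) over the same grid of \(K\) values and picks the minimizer:

```python
import numpy as np

# Hypothetical data standing in for the Hald example.
rng = np.random.default_rng(2)
n, p = 30, 4
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, 1.0, -1.0, 0.5]) + rng.normal(scale=0.5, size=n)

def gcv(k):
    # Ridge hat matrix H_{R,k} = X (X'X + k I)^{-1} X'
    H = X @ np.linalg.solve(X.T @ X + k * np.eye(p), X.T)
    resid = y - H @ y
    return (resid @ resid) / (n - (1 + np.trace(H))) ** 2

ks = np.arange(0.0, 0.2, 0.001)          # same grid as K = seq(0, 0.2, 0.001)
gcv_vals = np.array([gcv(k) for k in ks])
k_gcv = ks[gcv_vals.argmin()]            # analogue of kest(mod)$kGCV
```

The CV variant works the same way, replacing the GCV denominator with \(n\) and using leave-one-out residuals.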