survMisc (version 0.4.6)

ic: Information criterion

Description

Information Criterion for a fitted model.

Usage

BIC(object, ...)

## S3 method for class 'coxph': BIC(object, ...)

AIC(object, ..., k = 2)

## S3 method for class 'coxph': AIC(object, ..., k = 2)

AICc(object, ...)

## S3 method for class 'coxph': AICc(object, ..., k = 2)
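
A sketch of a typical call, assuming the survival package (for `coxph` and its bundled `lung` data) is installed. survMisc supplies `AICc` and the `coxph` methods above; to keep the sketch self-contained with survival alone, AICc is computed here from the formula in the Details section via `stats::AIC` and `logLik`.

```r
library(survival)  # coxph() and the example 'lung' data

fit <- coxph(Surv(time, status) ~ age + sex, data = lung)

## stats::AIC and stats::BIC work on coxph fits via logLik();
## AICc is computed from the finite-sample correction formula
edf  <- attr(logLik(fit), "df")          # equivalent degrees of freedom
n    <- fit$n                            # number of observations used
aicc <- AIC(fit) + (2 * edf * (edf + 1)) / (n - edf - 1)

c(AIC = AIC(fit), BIC = BIC(fit), AICc = aicc)
```

Lower values indicate the preferred model when comparing fits to the same data.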

Arguments

object
An object of class coxph
...
Not implemented
k
The weight applied to the equivalent degrees of freedom (edf) in the AIC formula; k = 2 gives the traditional AIC

Value

  • A named vector containing the value of the information criterion and the equivalent degrees of freedom (edf)

Details

Given a set of candidate models for the same data, the preferred model is the one with the minimum IC value.

The Akaike information criterion, AIC, is given by $$AIC = k \cdot edf - 2 \ln L$$ where $edf$ is the equivalent degrees of freedom (i.e., equivalent to the number of free parameters in the model), $L$ is the model likelihood, and $k$ is a constant, equal to $2$ for the traditional AIC.

AIC corrected for finite sample size $n$, AICc, is $$AICc = AIC + \frac{k \cdot edf(edf+1)}{n-edf-1}$$ where $n$ is the sample size. The correction adds a further penalty for each parameter, which vanishes as $n$ grows.

The Bayesian information criterion is $$BIC = edf \cdot \ln n - 2 \ln L$$ This penalises models with more parameters to a greater extent than AIC whenever $\ln n > k$.
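
The arithmetic behind the three criteria can be sketched directly in base R (this is an illustration of the formulas above, not the package's internal code; the log-likelihood, edf, and sample size below are hypothetical values chosen for the example).

```r
## Hypothetical fit: log-likelihood logL, edf free parameters,
## n observations (values chosen only to illustrate the formulas)
logL <- -180.5
edf  <- 3
n    <- 100
k    <- 2   # weight per edf; k = 2 gives the traditional AIC

aic  <- k * edf - 2 * logL
aicc <- aic + (k * edf * (edf + 1)) / (n - edf - 1)
bic  <- log(n) * edf - 2 * logL

## AICc exceeds AIC by a correction that shrinks as n grows;
## BIC weights each parameter by log(n) rather than k
c(AIC = aic, AICc = aicc, BIC = bic)
```

With these values, AIC = 367, AICc = 367.25, and BIC = 3·ln(100) + 361 ≈ 374.82; since ln(100) > 2, BIC penalises the three parameters more heavily than AIC does.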