
gmm (version 0.1-0)

gmm: Generalized method of moments estimation

Description

Function to estimate a vector of parameters based on moment conditions using the GMM method of Hansen (1982). This is a preliminary version; it has been tested with Monte Carlo experiments but may still contain bugs. Comments are welcome.
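In compact notation, with $\bar{g}(\theta)$ the $q \times 1$ vector of sample moment conditions, the GMM estimator minimizes the quadratic form $\bar{g}(\theta)'\,var(\bar{g})^{-1}\,\bar{g}(\theta)$; this is the objective function reported in the Value section below.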

Usage

gmm(g,t0,x,grad=NULL,type=c("twoStep","cue","iterative"),kernel=c("Quadratic Spectral", 
    "Truncated", "Bartlett", "Parzen", "Tukey-Hanning"),iid=FALSE,crit=10e-7,
     itermax=100,algo=c("Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN"), vcov=HAC, ...)

Arguments

g
A function of the form $g(\theta,x)$ which returns a $n \times q$ matrix of time series. This series is then used to build the q sample moment conditions (a minimal sketch of such a function is given after the argument list).
t0
A $k \times 1$ vector of starting values
x
The matrix or vector of data from which the function $g(\theta,x)$ is computed
grad
A function of the form $G(\theta,x)$ which returns a $q\times k$ matrix of derivatives of $\bar{g}(\theta)$. By default, the numerical algorithm numericDeriv is used.
type
The GMM method: "twoStep" is the two-step GMM proposed by Hansen (1982), while "cue" and "iterative" are respectively the continuously updated and the iterative GMM proposed by Hansen, Heaton and Yaron (1996)
kernel
Type of kernel used to compute the covariance matrix of the vector of moment conditions. See HAC for more details
iid
Hypothesis on the properties of x. By default, x is assumed to be a weakly dependent time series
crit
The level of precision required for the iterative GMM
itermax
The maximum number of iterations for the iterative GMM
algo
The numerical algorithm for the optimization problem. See optim for more details.
vcov
The method used to compute the covariance matrix. By default it is the HAC estimator; for now it is the only option, but more choices will be available in a future version.
...
Additional options to pass to the HAC procedure.
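
As a minimal sketch of what the g argument expects (the function g.ols below is only an illustration, not part of the package), consider the moment conditions of a simple linear regression $y = \theta_1 + \theta_2 z + u$, where x is assumed to be an $n \times 2$ matrix holding y in its first column and z in its second:

g.ols <- function(tet,x)
	{
	u <- x[,1] - tet[1] - tet[2]*x[,2]   # residuals of the linear model
	f <- cbind(u, u*x[,2])               # moments E[u] = 0 and E[u*z] = 0
	return(f)
	}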

Value

  • 'gmm' returns an object of 'class' '"gmm"'

The function 'summary' is used to obtain and print a summary of the results. It also computes the J-test of overidentifying restrictions. The object of class "gmm" is a list containing:

    par: $k\times 1$ vector of parameters

    vcov: the covariance matrix

    objective: the value of the objective function $\bar{g}'var(\bar{g})^{-1}\bar{g}$
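
As a usage sketch (assuming the fitted object resgmm created in the Examples section below), these components can be inspected directly:

summary(resgmm)            # summary of the results, including the J-test
resgmm$par                 # estimated parameter vector
sqrt(diag(resgmm$vcov))    # standard errors implied by the covariance matrix
resgmm$objective           # value of the GMM objective function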

Details

weightsAndrews2 and bwAndrews2 are simply modified versions of weightsAndrews and bwAndrews from the package sandwich. The modifications have been made so that the argument x can be a matrix instead of an object of class lm or glm. The details on how it works can be found in the sandwich manual.

References

Zeileis A (2006), Object-oriented Computation of Sandwich Estimators. Journal of Statistical Software, 16(9), 1--16. URL http://www.jstatsoft.org/v16/i09/.

Andrews DWK (1991), Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation. Econometrica, 59, 817--858.

Hansen, L.P. (1982), Large Sample Properties of Generalized Method of Moments Estimators. Econometrica, 50, 1029--1054.

Hansen, L.P. and Heaton, J. and Yaron, A. (1996), Finite-Sample Properties of Some Alternative GMM Estimators. Journal of Business and Economic Statistics, 14, 262--280.

Examples

# Moment conditions for an AR(2) model estimated by GMM:
# u is the residual of the regression of x on its first two lags;
# the instruments are a constant and lags 2 to 4 of the series.
g <- function(tet,x)
	{
	n <- nrow(x)
	u <- (x[7:n] - tet[1] - tet[2]*x[6:(n-1)] - tet[3]*x[5:(n-2)])
	f <- cbind(u,u*x[5:(n-2)],u*x[4:(n-3)],u*x[3:(n-4)])
	return(f)
	}

# Analytical derivative of the sample moment conditions with respect
# to the parameters: a q x k matrix, as described for the grad argument.
Dg <- function(tet,x)
	{
	n <- nrow(x)
	xx <- cbind(rep(1,(n-6)),x[6:(n-1)],x[5:(n-2)])
	H  <- cbind(rep(1,(n-6)),x[5:(n-2)],x[4:(n-3)],x[3:(n-4)])
	f <- -crossprod(H,xx)/(n-6)
	return(f)
	}

# Simulate an ARMA(2,1) series and estimate the parameters by GMM.
# Note that sd is an argument of arima.sim itself, not of the model list.
n <- 500
phi <- c(.2,.7)
thet <- 0
sd <- .2
x <- matrix(arima.sim(n=n,list(order=c(2,0,1),ar=phi,ma=thet),sd=sd),ncol=1)
resgmm <- gmm(g,c(0,.3,.6),x,grad=Dg)
