
gmm(g, x, t0 = NULL, gradv = NULL, type = c("twoStep", "cue", "iterative"),
    wmatrix = c("optimal", "ident"), vcov = c("HAC", "iid"),
    kernel = c("Quadratic Spectral", "Truncated", "Bartlett",
               "Parzen", "Tukey-Hanning"),
    crit = 10e-7, bw = bwAndrews2, prewhite = FALSE, ar.method = "ols",
    approx = "AR(1)", tol = 1e-7, itermax = 100,
    optfct = c("optim", "optimize", "nlminb"),
    model = TRUE, X = FALSE, Y = FALSE, TypeGmm = "baseGmm", ...)
gradv: The gradient of the moment conditions with respect to the coefficients. If not provided, the numerical derivative numericDeriv is used. It is of course strongly suggested to provide this function when it is possible.

vcov: Assumption on the properties of the moment conditions: either "iid" or "HAC" for a heteroskedasticity and autocorrelation consistent covariance matrix (see HAC for more details).

bw: The method to compute the bandwidth parameter. By default it is bwAndrews2, which is proposed by Andrews (1991). The alternative is bwNeweyWest2 of Newey-West (1994).

prewhite: logical or integer. Should the estimating functions be prewhitened? If TRUE or greater than 0, a VAR model of order as.integer(prewhite) is fitted via ar with method "ols" and demean = FALSE.

ar.method: character. The method argument passed to ar for prewhitening.

approx: a character specifying the approximation method if the bandwidth has to be chosen by bwAndrews2.

tol: numeric. Weights that exceed tol are used for computing the covariance matrix; all other weights are treated as 0.

model, X, Y: logicals. If TRUE, the corresponding components of the fit (the model frame, the model matrix, the response) are returned if g is a formula.

TypeGmm: The name of the class object created by the method getModel. It allows developers to extend the package and create other GMM methods.

...: More options to give to optim.
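To illustrate the covariance options above, here is a hedged sketch (reusing z, zm and h from the CAPM example below) that requests a HAC matrix with a Bartlett kernel, Newey-West bandwidth selection and one round of VAR(1) prewhitening:

res_hac <- gmm(z ~ zm, x = h, vcov = "HAC", kernel = "Bartlett",
               bw = bwNeweyWest2, prewhite = 1)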
The function summary is used to obtain and print a summary of the results. It also computes the J-test of overidentifying restrictions.
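A minimal usage sketch (assuming a fitted object res, as created in the CAPM example below):

summary(res)   # coefficient table plus the J-test of overidentifying restrictions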
The object of class "gmm" is a list containing at least:

terms: the terms object used when g is a formula.

weightsAndrews2 and bwAndrews2 are simply modified versions of weightsAndrews and bwAndrews from the package sandwich. The modifications have been made so that the argument x can be a matrix instead of an object of class lm or glm. The details on how it works can be found in the sandwich manual.

If we want to estimate a model like $Y_t = \theta_1 + X_{2t}\theta_2 + \cdots + X_{kt}\theta_k + \epsilon_t$ using the moment conditions $Cov(\epsilon_t H_t) = 0$, where $H_t$ is a vector of $Nh$ instruments, then we can define "g" as we do for lm. We would have g = y ~ x2 + x3 + ... + xk, and the argument "x" above would become the matrix $H$ of instruments. As for lm, $Y_t$ can be a $Ny \times 1$ vector, which would imply that $k = Nh \times Ny$. The intercept is included by default, so you do not have to add a column of ones to the matrix $H$, and you do not need to provide the gradient in that case since it is embedded in gmm. The intercept can be removed by adding -1 to the formula; in that case, the column of ones needs to be added manually to $H$.
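A minimal sketch of this formula interface with simulated data (the variable names y, x2, h1, h2 are illustrative only, not part of the package):

library(gmm)
set.seed(123)
n <- 200
h1 <- rnorm(n)                 # first instrument
h2 <- rnorm(n)                 # second instrument
x2 <- h1 + h2 + rnorm(n)       # regressor correlated with the instruments
y <- 1 + 0.5 * x2 + rnorm(n)
H <- cbind(h1, h2)             # the instrument matrix passed as "x"
coef(gmm(y ~ x2, x = H))       # intercept included by default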
The following explains the last example below. Thanks to Dieter Rozenich, a student from the Vienna University of Economics and Business Administration. He suggested that it would help to understand the implementation of the Jacobian.
For the two parameters of a normal distribution $(\mu,\sigma)$ we have the following three moment conditions:

$$m_1 = \mu - x_i, \qquad m_2 = \sigma^2 - (x_i - \mu)^2, \qquad m_3 = x_i^3 - \mu(\mu^2 + 3\sigma^2)$$

$m_1$ and $m_2$ follow directly from the definition of $(\mu,\sigma)$; $m_3$ uses the third moment of the normal distribution, $E[x_i^3] = \mu^3 + 3\mu\sigma^2$.
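Differentiating these conditions with respect to $(\mu,\sigma)$ and averaging the data-dependent term over the sample gives the Jacobian that Dg implements in the example below:

$$
G(\mu,\sigma) =
\begin{pmatrix}
1 & 0 \\
2(\bar{x} - \mu) & 2\sigma \\
-3\mu^2 - 3\sigma^2 & -6\mu\sigma
\end{pmatrix}
$$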
Andrews DWK (1991), Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation. Econometrica, 59, 817--858.
Newey WK & West KD (1987), A Simple, Positive Semi-Definite, Heteroskedasticity and Autocorrelation Consistent Covariance Matrix. Econometrica, 55, 703--708.
Newey WK & West KD (1994), Automatic Lag Selection in Covariance Matrix Estimation. Review of Economic Studies, 61, 631--653.
Hansen LP (1982), Large Sample Properties of Generalized Method of Moments Estimators. Econometrica, 50, 1029--1054.
Hansen LP, Heaton J & Yaron A (1996), Finite-Sample Properties of Some Alternative GMM Estimators. Journal of Business and Economic Statistics, 14, 262--280.
## CAPM test with GMM
data(Finance)
r <- Finance[1:300, 1:10]       # returns on 10 assets
rm <- Finance[1:300, "rm"]      # market return
rf <- Finance[1:300, "rf"]      # risk-free rate
z <- as.matrix(r - rf)          # excess asset returns
t <- nrow(z)
zm <- rm - rf                   # excess market return
h <- matrix(zm, t, 1)           # instrument: the excess market return
res <- gmm(z ~ zm, x = h)
summary(res)
## linear tests can be performed using linear.hypothesis from the car package
## The CAPM can be tested as follows:
library(car)
linear.hypothesis(res, cbind(diag(10), matrix(0, 10, 10)), rep(0, 10))
# The CAPM of Black
g <- function(theta, x) {
  e <- x[, 2:11] - theta[1] - (x[, 1] - theta[1]) %*% matrix(theta[2:11], 1, 10)
  gmat <- cbind(e, e * c(x[, 1]))
  return(gmat)
}
x <- as.matrix(cbind(rm, r))    # market return in column 1, asset returns in 2:11
res_black <- gmm(g, x = x, t0 = rep(0, 11))
summary(res_black)$coefficients
## APT test with Fama-French factors and GMM
f1 <- zm                            # market factor
f2 <- Finance[1:300, "hml"] - rf    # HML factor
f3 <- Finance[1:300, "smb"] - rf    # SMB factor
h <- cbind(f1, f2, f3)              # instruments: the three factors
res2 <- gmm(z ~ f1 + f2 + f3, x = h)
coef(res2)
summary(res2)$coefficients
## The following example has been provided by Dieter Rozenich (see details).
# It generates normal random numbers and uses the GMM to estimate
# mean and sd.
#-------------------------------------------------------------------------------
# Random numbers of a normal distribution
# First we generate normally distributed random numbers and compute the two parameters:
n <- 1000
x <- rnorm(n, mean = 4, sd = 2)
# Implementing the 3 moment conditions
g <- function(tet, x)
{
  m1 <- (tet[1] - x)
  m2 <- (tet[2]^2 - (x - tet[1])^2)
  m3 <- x^3 - tet[1] * (tet[1]^2 + 3 * tet[2]^2)
  f <- cbind(m1, m2, m3)
  return(f)
}
# Implementing the jacobian
Dg <- function(tet, x)
{
  jacobian <- matrix(c(1, 2 * (-tet[1] + mean(x)), -3 * tet[1]^2 - 3 * tet[2]^2,
                       0, 2 * tet[2], -6 * tet[1] * tet[2]),
                     nrow = 3, ncol = 2)
  return(jacobian)
}
# Now we want to estimate the two parameters using the GMM.
gmm(g, x, c(0, 0), gradv = Dg)
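# As a check (hedged sketch): omitting gradv makes gmm fall back on
# numerical derivatives, which should give nearly identical estimates.
gmm(g, x, c(0, 0))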