
MARSS (version 3.2)

MARSSkem: Maximum Likelihood Estimation for Multivariate Autoregressive State-Space Models

Description

MARSSkem() performs maximum-likelihood estimation, using an EM algorithm for constrained and unconstrained MARSS models. This is one of the base functions in the MARSS-package.

Usage

MARSSkem(MLEobj)

Arguments

MLEobj
An object of class marssMLE.

Value

  • The marssMLE object which was passed in, with additional components (see the access sketch after this list):
  • method: String "kem".
  • kf: Kalman filter output.
  • iter.record: If MLEobj$control$trace = TRUE, a list with par = a record of each estimated parameter over all EM iterations and logLik = a record of the log-likelihood at each iteration.
  • numIter: Number of iterations needed for convergence.
  • convergence: Did estimation converge successfully?
  • logLik: Log-likelihood.
  • states: State estimates from the Kalman filter.
  • states.se: Confidence intervals based on state standard errors; see the caption of Fig. 6.3 (p. 337) in Shumway & Stoffer (2006).
  • errors: Any error messages.
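
The components above can be accessed directly from the returned marssMLE object. The following is a minimal sketch, assuming a fitted object named kemfit produced from an MLEobj built as in the Examples section below; component names follow the list above, and the 1.96 multiplier is only an illustrative way to form approximate intervals from the state standard errors.

kemfit <- MARSSkem(MLEobj)      # MLEobj built as in the Examples section
kemfit$logLik                   # maximized log-likelihood
kemfit$numIter                  # number of EM iterations used
kemfit$convergence              # convergence flag
# Approximate intervals from the state estimates and their standard errors
upperCI <- kemfit$states + 1.96 * kemfit$states.se
lowerCI <- kemfit$states - 1.96 * kemfit$states.se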

Discussion

To ensure that the global maximum-likelihood values are found, it is recommended that for your final fit the initial parameter values be set using a Monte Carlo initialization (MLEobj$control$MCInit = TRUE), particularly if the model is not a good fit to the data. This requires more computation time, but it reduces the chance of the algorithm terminating at a local maximum and not reaching the true MLEs. For many models and for draft analyses this is unnecessary, but answers should be checked using an initial conditions search before final values are reported.

MARSSkem() calls a Kalman filter/smoother (MARSSkf) for hidden state estimation. The algorithm allows two options for the initial state conditions: fixed but unknown, or a prior. In the first case, x0 (whether at t=0 or t=1) is treated as fixed but unknown and is estimated; in this case, fixed$V0=0 and x0 is estimated. This is the default behavior. In the second case, the initial conditions are specified with a prior and V0 != 0. In the latter case, x0 or V0 may be estimated. MARSS will allow you to try to estimate both, but many researchers have noted that this is not robust, so you should fix one or the other.

If you get errors, it generally means that the solution involves an ill-conditioned matrix. For example, your Q or R matrix may be going to a value in which all elements have the same value, for example zero. If, for example, you tried to fit a model with a fixed, high R matrix and the variance in that R matrix was much higher than what is actually in the data, then you might drive Q to zero. Likewise, if you try to fit a structurally inadequate model, it is not unusual for Q to be driven to zero. For example, if you fit a model with one hidden state trajectory to data that clearly have two quite different hidden state trajectories, you might have this problem. Comparing the likelihood of this model to a model with more structural flexibility should reveal that the structurally inflexible model is inadequate (much lower likelihood).

Convergence testing is done via a combination of two tests. The first test (the abstol test) checks that the change in the absolute value of the log-likelihood from one iteration to another is less than some tolerance value (abstol). The second test (the log-log test) checks that the slope of a plot of the log of the parameter value or log-likelihood versus the log of the iteration number is less than some tolerance. Both of these must be met to generate the "Success! parameters converged" output. If you want to circumvent one of these tests, set the tolerance for the unwanted test to a high value; that guarantees that test is met before the convergence test you want to use is met. The tolerance for the abstol test is set by control$abstol and the tolerance for the log-log test is set by control$conv.test.slope.tol. Anything over 1 is huge for both of these.
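
As a concrete illustration of the control options discussed above, the following minimal sketch turns on the Monte Carlo initial conditions search and sets the two convergence tolerances before fitting. The tolerance values shown are illustrative, not the package defaults, and MLEobj is assumed to be a marssMLE object built with MARSS(..., fit=FALSE) as in the Examples section.

MLEobj$control$MCInit <- TRUE               # Monte Carlo search over initial parameter values
MLEobj$control$abstol <- 0.0001             # tolerance for the abstol (log-likelihood change) test
MLEobj$control$conv.test.slope.tol <- 0.1   # tolerance for the log-log slope test
kemfit <- MARSSkem(MLEobj)
kemfit$convergence                          # check the convergence flag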

Details

Objects of class marssMLE may be built from scratch, but they are easier to construct using MARSS with MARSS(..., fit=FALSE). Options for MARSSkem() may be set using MLEobj$control; the commonly used elements of control include those referenced above (e.g. MCInit, trace, abstol, and conv.test.slope.tol). See marssMLE for the full list of control elements and their meanings. A sketch of setting a control element before fitting follows.
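
The following minimal sketch shows one way to build the marssMLE object without fitting, set a control element (here trace, so that iter.record is returned), and then fit with MARSSkem(). The model specification matches the Examples section and is only illustrative.

library(MARSS)
dat <- t(harborSeal)[2:4, ]
MLEobj <- MARSS(dat, model = list(Q = "diagonal and equal", U = "equal"), fit = FALSE)
MLEobj$control$trace <- TRUE        # record parameters and logLik at each EM iteration
kemfit <- MARSSkem(MLEobj)
kemfit$iter.record$logLik           # log-likelihood at each iteration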

References

R. H. Shumway and D. S. Stoffer (2006). Chapter 6 in Time Series Analysis and its Applications. Springer-Verlag, New York.

Ghahramani, Z. and Hinton, G. E. (1996). Parameter estimation for linear dynamical systems. Technical Report CRG-TR-96-2, University of Toronto, Dept. of Computer Science.

Harvey, A. C. (1989). Chapter 5 in Forecasting, Structural Time Series Models and the Kalman Filter. Cambridge University Press, Cambridge, UK.

The user guide: Holmes, E. E., E. J. Ward, and M. D. Scheuerell (2012). Analysis of multivariate time-series using the MARSS package. NOAA Fisheries, Northwest Fisheries Science Center, 2725 Montlake Blvd E., Seattle, WA 98112. Type RShowDoc("UserGuide", package="MARSS") to open a copy.

Holmes, E. E. (2012). Derivation of the EM algorithm for constrained and unconstrained multivariate autoregressive state-space (MARSS) models. Type RShowDoc("EMDerivation", package="MARSS") to open a copy.

See Also

MARSSmcinit, MARSSkf, marssMLE, MARSSoptim

Examples

library(MARSS)
dat <- t(harborSeal)
dat <- dat[2:4, ]
# Use MARSS() to construct a proper marssMLE object.
MLEobj <- MARSS(dat, model = list(Q = "diagonal and equal", U = "equal"), fit = FALSE)
# Pass this MLEobj to MARSSkem() to do the fit.
kemfit <- MARSSkem(MLEobj)
