
bayess (version 1.6)

hmhmm: Estimation of a hidden Markov model with 2 hidden and 4 observed states

Description

This function implements a Metropolis-within-Gibbs algorithm that produces a sample of the parameters \(p_{ij}\) and \(q^i_j\) of the hidden Markov model (Chapter 7). It relies on an internal function likej that computes the likelihood of the time series via a forward-backward algorithm.
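The forward pass underlying such a likelihood computation can be sketched as follows. This is a minimal illustration for a 2-hidden-state, 4-observed-state HMM, not the package's likej itself; the function name hmm_loglik and the uniform initial distribution are assumptions.

```r
# Hedged sketch of a forward-algorithm likelihood for a 2-hidden-state,
# 4-observed-state HMM. P is the 2x2 transition matrix (rows p_{i.}),
# Q is the 2x4 emission matrix (rows q^i), y a vector of values in 1:4.
hmm_loglik <- function(y, P, Q) {
  n <- length(y)
  # uniform initial distribution over the two hidden states (assumption)
  alpha <- c(0.5, 0.5) * Q[, y[1]]
  loglik <- 0
  for (t in seq_len(n)[-1]) {
    # rescale to avoid numerical underflow, accumulating the log constants
    cst <- sum(alpha)
    loglik <- loglik + log(cst)
    alpha <- alpha / cst
    # forward recursion: alpha_t(j) = sum_i alpha_{t-1}(i) P[i,j] Q[j, y_t]
    alpha <- as.vector(alpha %*% P) * Q[, y[t]]
  }
  loglik + log(sum(alpha))
}
```

With uniform transition and emission matrices the log-likelihood reduces to \(n \log(1/4)\), which gives a quick sanity check.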

Usage

hmhmm(M = 100, y)

Value

BigR

matrix of the iterated values returned by the MCMC algorithm, containing the transition probabilities \(p_{11}\) and \(p_{22}\) and the emission probability vectors \(q^1\) and \(q^2\) for the two latent states

olike

sequence of the log-likelihoods produced along the MCMC run

Arguments

M

Number of Gibbs iterations

y

time series to be modelled by a hidden Markov model

Details

The Metropolis-within-Gibbs step involves Dirichlet proposals with a random choice of the scale between 1 and 1e5.
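A proposal of this kind can be sketched as below. This is an illustration of the general mechanism, not the package's internal code; the function names rdirichlet1 and propose, and the exact way the scale is drawn, are assumptions.

```r
# Hedged sketch of a Dirichlet random-walk proposal: the current
# probability vector, multiplied by a randomly chosen scale between
# 1 and 1e5, is used as the Dirichlet parameter, so large scales give
# proposals concentrated near the current value.
rdirichlet1 <- function(alpha) {
  # a Dirichlet draw via normalised Gamma variates
  x <- rgamma(length(alpha), shape = alpha)
  x / sum(x)
}
propose <- function(current) {
  scale <- 10^runif(1, 0, 5)     # random scale in [1, 1e5] (assumed form)
  rdirichlet1(scale * current)
}
```

The proposal always returns a valid probability vector of the same length as its input, whatever the scale drawn.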

Examples

# simulate a short observed sequence over the four states and run the sampler
res <- hmhmm(M = 500, y = sample(1:4, 10, rep = TRUE))
# trace of the log-likelihood along the MCMC run
plot(res$olike, type = "l", main = "log-likelihood", xlab = "iterations", ylab = "")
