bayesMixModel(z, normNull=c(), expNeg=c(), expPos=c(), gamNeg=c(), gamPos=c(),
  sdNormNullInit=c(), rateExpNegInit=c(), rateExpPosInit=c(),
  shapeGamNegInit=c(), scaleGamNegInit=c(), shapeGamPosInit=c(),
  scaleGamPosInit=c(), piInit, classificationsInit, dirichletParInit=1,
  shapeDir=1, scaleDir=1, weightsPrior="FDD", sdAlpha, shapeNorm0=c(),
  scaleNorm0=c(), shapeExpNeg0=c(), scaleExpNeg0=c(), shapeExpPos0=c(),
  scaleExpPos0=c(), shapeGamNegAlpha0=c(), shapeGamNegBeta0=c(),
  scaleGamNegAlpha0=c(), scaleGamNegBeta0=c(), shapeGamPosAlpha0=c(),
  shapeGamPosBeta0=c(), scaleGamPosAlpha0=c(), scaleGamPosBeta0=c(),
  itb, nmc, thin, average="mean", sdShape)
The prior on the mixture weights is selected with weightsPrior="FDD".
The summary used for the allocation of observations to components is selected with average="mean" or average="median".
Note: For the allocation to components, results are given for the posterior mean, median and maximum density regardless of this specification.
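The three allocation summaries can be illustrated on plain R objects. This is a minimal sketch, not the package's internal code: it assumes a matrix of sampled component labels with one row per MCMC iteration and one column per observation.

set.seed(1)
# hypothetical sampled labels: 50 iterations, 10 observations, components 1 and 2
labels <- matrix(sample(1:2, 50 * 10, replace = TRUE), nrow = 50, ncol = 10)

allocMean   <- round(colMeans(labels))        # posterior mean, rounded to a label
allocMedian <- apply(labels, 2, median)       # posterior median
allocMode   <- apply(labels, 2, function(x)   # label with maximum posterior density
                     as.integer(names(which.max(table(x)))))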
An object of class MixModelBayes-class storing results, data,
priors, initial values and information about convergence.
Checking the convergence of the Markov chains with plotChains
is therefore strongly recommended.
Iterations during which one of the chains has not yet reached stationarity should not be used for analysis;
they can be excluded by setting an appropriate burn-in value itb.
Autocorrelation between successive chain values can be reduced by thinning the chain,
i.e. by setting an appropriate value for thin.
To ensure a sufficient number of iterations for the chains after the burn-in,
nmc should be increased when the thinning is increased.
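The interplay of itb, nmc and thin can be sketched on a generic chain; the values below are hypothetical, not defaults of bayesMixModel:

itb  <- 100   # burn-in iterations to discard
nmc  <- 1000  # total iterations
thin <- 10    # keep every 10th value

chain <- rnorm(nmc)                     # stand-in for one sampled parameter chain
keep  <- seq(itb + 1, nmc, by = thin)   # indices retained for analysis
chainThinned <- chain[keep]
length(chainThinned)                    # 90 values remain after burn-in and thinning

Increasing thin without increasing nmc shrinks the retained sample accordingly.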
The standard deviations of the proposal distribution in a Metropolis-Hastings step should be tuned to achieve a medium-level acceptance rate (e.g., 0.3-0.7):
a very low acceptance rate causes long running times of the algorithm, while a very high acceptance rate
typically leads to autocorrelation between the values of the chain. Acceptance is documented for each iteration in the chains
slot of objects of class MixModelBayes-class.
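The effect of the proposal standard deviation on the acceptance rate can be seen in a minimal random-walk Metropolis-Hastings sketch targeting a standard normal; this is an illustration only, not the sampler used internally by bayesMixModel:

set.seed(1)
mhAcceptRate <- function(sdProp, n = 5000) {
  x <- 0
  accepted <- logical(n)
  for (i in seq_len(n)) {
    xNew <- x + rnorm(1, sd = sdProp)            # random-walk proposal
    # accept with probability min(1, target(xNew)/target(x))
    if (log(runif(1)) < dnorm(xNew, log = TRUE) - dnorm(x, log = TRUE)) {
      x <- xNew
      accepted[i] <- TRUE
    }
  }
  mean(accepted)
}

mhAcceptRate(0.1)   # tiny steps: acceptance near 1, strongly autocorrelated chain
mhAcceptRate(2.5)   # moderate steps: medium-level acceptance
mhAcceptRate(50)    # huge steps: acceptance near 0, chain rarely moves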
plotChains, MixModelBayes-class
# Example 1: data from two zero-mean normal components
set.seed(1000)
z <- c(rnorm(1000, 0, 0.5), rnorm(1000, 0, 1))
# fit a mixture of two null (normal) components
mm <- bayesMixModel(z, normNull=1:2, sdNormNullInit=c(0.1, 0.2),
  piInit=c(1/2, 1/2), shapeNorm0=c(1, 1), scaleNorm0=c(1, 1),
  shapeExpNeg0=c(), scaleExpNeg0=c(),
  shapeExpPos0=c(), scaleExpPos0=c(), sdAlpha=1, itb=100, nmc=1000, thin=10)
mm
plotComponents(mm)
# check convergence of the chains for the mixture weights
plotChains(mm, chain="pi")
# Example 2: two normal components plus negative and positive gamma components
z <- c(rnorm(200, 0, 1), rnorm(200, 0, 5), rexp(200, 0.1), -rexp(200, 0.1))
mm <- bayesMixModel(z, normNull=1:2, gamNeg=3, gamPos=4,
  sdNormNullInit=c(1, 1),
  shapeGamNegInit=1, scaleGamNegInit=1, shapeGamPosInit=1, scaleGamPosInit=1,
  shapeNorm0=c(1, 3), scaleNorm0=c(1, 3), sdAlpha=1,
  shapeGamNegAlpha0=1, shapeGamNegBeta0=1,
  scaleGamNegAlpha0=1, scaleGamNegBeta0=1,
  shapeGamPosAlpha0=1, shapeGamPosBeta0=1,
  scaleGamPosAlpha0=1, scaleGamPosBeta0=1, sdShape=0.025,
  itb=100, nmc=1000, thin=10)
mm
plotComponents(mm)
# check convergence of the chains for the mixture weights
plotChains(mm, chain="pi")