This function is used internally by mmer when more than one variance component needs to be estimated through the use of the average information (AI) algorithm.
AI(y, X=NULL, ZETA=NULL, R=NULL, draw=TRUE, REML=TRUE, silent=FALSE, iters=50, constraint=TRUE, init=NULL, sherman=FALSE, che=TRUE, EIGEND=FALSE, Fishers=FALSE, gss=TRUE, forced=NULL)
.
y = Xb + Zu + e
.
b ~ N[b.hat, 0] ............zero variance because it is a fixed term
u ~ N[0, K*sigma(u)] .......where: K*sigma(u) = G
e ~ N[0, I*sigma(e)] .......where: I*sigma(e) = R
y ~ N[Xb, var(Zu+e)] .......where:
var(y) = var(Zu+e) = ZGZ'+R = V, which is the phenotypic variance
.
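As a rough illustration of the model above, the sketch below assembles V = ZGZ' + R in R for a toy dataset; the dimensions and variance values are assumptions made for the example, not defaults of the package:

## Sketch: assembling V = ZGZ' + R for a toy model
## (toy dimensions and variance values; for illustration only)
set.seed(1)
n <- 6; q <- 3                                      # 6 records, 3 levels of u
Z <- model.matrix(~ factor(rep(1:q, each = 2)) - 1) # incidence matrix of u
K <- diag(q)                                        # relationship matrix
sigma.u <- 2; sigma.e <- 1                          # assumed variance components
G <- K * sigma.u                                    # G = K*sigma(u)
R <- diag(n) * sigma.e                              # R = I*sigma(e)
V <- Z %*% G %*% t(Z) + R                           # var(y) = ZGZ' + R
.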
The function allows the user to specify the incidence matrices with their respective variance-covariance matrices in a two-level list structure. For example, imagine a mixed model with the following design:
.
fixed = only intercept.....................b ~ N[b.hat, 0]
random = GCA1 + GCA2 + SCA.................u ~ N[0, G]
.
where G is:
.
|K*sigma(gca1)...........0....................0...........|
|......0............S*sigma(gca2).............0...........| = G
|......0...................0.............W*sigma(sca).....|
.
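A quick way to see how these blocks stack is to build G explicitly. In the sketch below K, S, W and the variance values are illustrative placeholders, and the Matrix package is used only for its block-diagonal constructor:

## Sketch: G as a block-diagonal matrix for GCA1 + GCA2 + SCA
## (K, S, W and the variance values are illustrative placeholders)
library(Matrix)
K <- diag(4); S <- diag(4); W <- diag(16)           # toy relationship matrices
sigma.gca1 <- 1.5; sigma.gca2 <- 1.2; sigma.sca <- 0.8
G <- bdiag(K * sigma.gca1, S * sigma.gca2, W * sigma.sca)
dim(G)  # (4 + 4 + 16) x (4 + 4 + 16)
.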
The likelihood function optimized in this algorithm is:
.
logL = -0.5 * ( log( | V | ) + log( | X'VinvX | ) + y'Py )
.
where: | | refers to the determinant of a matrix
.
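The sketch below writes this log-likelihood out for given y, X and V. It is only a didactic re-statement of the formula with toy data, not the internal code of the function:

## Sketch: evaluating the restricted log-likelihood for given y, X, V
## (toy data; didactic re-statement of the formula, not the internal code)
set.seed(2)
n <- 6
X <- matrix(1, n, 1)                                # intercept only
V <- diag(n) * 1.5 + 0.5                            # some positive-definite V
y <- rnorm(n)
Vinv <- solve(V)
P <- Vinv - Vinv %*% X %*% solve(t(X) %*% Vinv %*% X) %*% t(X) %*% Vinv
logL <- -0.5 * as.numeric(
          determinant(V, logarithm = TRUE)$modulus +
          determinant(t(X) %*% Vinv %*% X, logarithm = TRUE)$modulus +
          t(y) %*% P %*% y)
logL
.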
The algorithm can be summarized in the next steps:
.
1) provide initial values for the variance components
2) estimate the phenotypic variance matrix V = ZGZ' + R
3) obtain Vinv by inverting V
4) obtain the projection matrix P = Vinv - [Vinv X (X'Vinv X)^-1 X' Vinv]
5) evaluate the log-likelihood as shown above
6) fill the average information matrix (AI) with equation provided in Gilmour et al. (1995)
7) obtain AI.inv by inverting AI (the average information matrix)
8) calculate the scores (first derivatives), referred to as "B" in Gilmour et al. (1995)
9) update the variance components by: k(i+1) = k(i) + [ AI.inv * B(i) ]
10) repeat steps 2-9 in a while loop until convergence is reached, i.e. until the likelihood no longer increases.
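To make the loop concrete, the sketch below re-implements these steps for the simplest case of one random effect plus error (two variance components). It is a didactic approximation with simulated data and no constraint handling, not the internal code of the function:

## Sketch: simplified AI-REML loop for one random effect + error
## (simulated data, no constraint handling; didactic only)
set.seed(3)
n <- 40; q <- 8
Z <- model.matrix(~ factor(rep(1:q, length.out = n)) - 1)
K <- diag(q)                                        # relationship matrix
X <- matrix(1, n, 1)                                # intercept only
u <- rnorm(q, 0, sqrt(2)); e <- rnorm(n, 0, 1)
y <- X %*% 5 + Z %*% u + e                          # simulated response
theta <- c(var(y)/2, var(y)/2)                      # 1) initial values (sigma.u, sigma.e)
dV    <- list(Z %*% K %*% t(Z), diag(n))            # derivatives of V w.r.t. each component
logL0 <- -Inf
for (it in 1:50) {
  V    <- dV[[1]]*theta[1] + dV[[2]]*theta[2]       # 2) V = ZGZ' + R
  Vinv <- solve(V)                                  # 3) invert V
  P    <- Vinv - Vinv %*% X %*%                     # 4) projection matrix
            solve(t(X) %*% Vinv %*% X) %*% t(X) %*% Vinv
  logL <- -0.5 * as.numeric(                        # 5) log-likelihood
            determinant(V, TRUE)$modulus +
            determinant(t(X) %*% Vinv %*% X, TRUE)$modulus +
            t(y) %*% P %*% y)
  AI <- matrix(0, 2, 2); B <- numeric(2)
  for (i in 1:2) {
    B[i] <- -0.5 * (sum(diag(P %*% dV[[i]])) -      # 8) scores (first derivatives)
                    as.numeric(t(y) %*% P %*% dV[[i]] %*% P %*% y))
    for (j in 1:2)                                  # 6) average information matrix
      AI[i, j] <- 0.5 * as.numeric(t(y) %*% P %*% dV[[i]] %*% P %*% dV[[j]] %*% P %*% y)
  }
  theta <- as.numeric(theta + solve(AI) %*% B)      # 7) and 9) update the components
  if (abs(logL - logL0) < 1e-6) break               # 10) stop when logL stabilizes
  logL0 <- logL
}
theta                                               # estimated variance components
.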
Covarrubias-Pazaran G (2016) Genome assisted prediction of quantitative traits using the R package sommer. PLoS ONE 11(6): doi:10.1371/journal.pone.0156744
Gilmour et al. 1995. Average Information REML: An efficient algorithm for variance parameter estimation in linear mixed models. Biometrics 51(4):1440-1450.
Lee et al. 2015. EIGEND: An efficient algorithm for multivariate linear mixed model analysis based on genomic information. Cold Spring Harbor. doi: http://dx.doi.org/10.1101/027201.
mmer and mmer2
####=========================================####
#### For CRAN time limitations most lines in the
#### examples are silenced with one '#' mark,
#### remove them and run the examples
####=========================================####
####=========================================####
#### breeding values with 3 variance components
####=========================================####
####=========================================####
## Import phenotypic data on inbred performance
## Full data
####=========================================####
data(cornHybrid)
hybrid2 <- cornHybrid$hybrid # extract cross data
A <- cornHybrid$K # extract the var-cov K
y <- hybrid2$Yield
X1 <- model.matrix(~ Location, data = hybrid2);dim(X1)
Z1 <- model.matrix(~ GCA1 -1, data = hybrid2);dim(Z1)
Z2 <- model.matrix(~ GCA2 -1, data = hybrid2);dim(Z2)
Z3 <- model.matrix(~ SCA -1, data = hybrid2);dim(Z3)
####=========================================####
#### Realized IBS relationships for set of parents 1
####=========================================####
K1 <- A[levels(hybrid2$GCA1), levels(hybrid2$GCA1)]; dim(K1)
####=========================================####
#### Realized IBS relationships for set of parents 2
####=========================================####
K2 <- A[levels(hybrid2$GCA2), levels(hybrid2$GCA2)]; dim(K2)
####=========================================####
#### Realized IBS relationships for cross
#### (as the Kronecker product of K1 and K2)
####=========================================####
S <- kronecker(K1, K2) ; dim(S)
rownames(S) <- colnames(S) <- levels(hybrid2$SCA)
ETA <- list(list(Z=Z1, K=K1), list(Z=Z2, K=K2), list(Z=Z3, K=S))
####=========================================####
#### run the next line; it was omitted for CRAN time limitations
####=========================================####
#ans <- AI(y=y, ZETA=ETA)
#ans$var.comp