kappam_vanbelle: Agreement between two groups of raters
Description
This function extends Cohen's and Fleiss' kappa measures of interrater
agreement to the agreement between two groups of raters, taking into account
the heterogeneity within each group.
Usage
kappam_vanbelle(
ratings,
refIdx,
ratingScale = NULL,
weights = c("unweighted", "linear", "quadratic"),
conf.level = 0.95
)
Value
list. Kappa agreement between the two groups of raters
Arguments
- ratings
matrix of subjects x raters for both groups of raters
- refIdx
numeric. indices of raters that constitute the reference group.
Can also be all negative to define rater group by exclusion.
- ratingScale
character vector of the levels for the rating, or NULL.
- weights
optional weighting scheme: "unweighted", "linear", "quadratic"
- conf.level
confidence level for interval estimation
Details
Data need to be stored with raters in columns.
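For illustration, a minimal sketch with made-up data (the toy matrix and the negative-index call are assumptions based on the argument descriptions above, not part of the package documentation):
# hypothetical toy data: 6 subjects rated by 5 raters on a 3-point scale,
# subjects in rows, raters in columns
set.seed(1)
ratings <- matrix(sample(1:3, 6 * 5, replace = TRUE), nrow = 6)
# raters 4 and 5 form the reference group
kappam_vanbelle(ratings, refIdx = 4:5)
# presumably equivalent: define the reference group by excluding raters 1 to 3
kappam_vanbelle(ratings, refIdx = -(1:3))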
References
Vanbelle, S., Albert, A. Agreement between Two Independent Groups
of Raters. Psychometrika 74, 477–491 (2009).
doi:10.1007/s11336-009-9116-1
Examples
# compare student ratings with ratings of 11 experts
kappam_vanbelle(SC_test, refIdx = 40:50)
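A further sketch (not from the package examples): if the rating scale is ordinal, a weighting scheme and a different confidence level can be requested via the arguments documented above.
# assumed variant: quadratic weights, 90% confidence interval
kappam_vanbelle(SC_test, refIdx = 40:50, weights = "quadratic", conf.level = 0.9)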