Inter-rater agreement

The degree of agreement among raters: a measure of how much homogeneity, or consensus, there is in the ratings given by the appraisers. Also known as inter-rater reliability or concordance. Options for measuring inter-rater agreement include the joint probability of agreement, Cohen's kappa and the related Fleiss' kappa, inter-rater correlation, the concordance correlation coefficient, and the intra-class correlation.
Synonyms: inter-rater reliability, concordance
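As an illustration of two of the measures listed above, here is a minimal Python sketch (not part of the original glossary entry) computing the joint probability of agreement and Cohen's kappa for two raters on a categorical rating task. The rater labels and item counts are hypothetical.

```python
from collections import Counter

def joint_agreement(r1, r2):
    """Joint probability of agreement: fraction of items rated identically."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_o = joint_agreement(r1, r2)                 # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    labels = set(r1) | set(r2)
    # Chance agreement: probability both raters pick the same label at random,
    # estimated from each rater's marginal label frequencies.
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings from two appraisers on ten items
rater_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "no"]

print(f"Joint agreement: {joint_agreement(rater_a, rater_b):.2f}")  # 0.70
print(f"Cohen's kappa:   {cohens_kappa(rater_a, rater_b):.2f}")     # 0.40
```

The example shows why kappa is often preferred over the raw joint probability: the two raters agree on 70% of items, but because much of that agreement could arise by chance given their label frequencies, the chance-corrected kappa is only 0.40.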