Function for interrater reliability analysis: computes kappa statistics for two or more raters and, optionally, Krippendorff's alpha.

Usage

agreement(
  data,
  vars,
  sft = FALSE,
  heatmap = FALSE,
  heatmapDetails = FALSE,
  wght = "unweighted",
  exct = FALSE,
  kripp = FALSE,
  krippMethod = "nominal"
)

Arguments

data

The data as a data frame. Each row represents a case/subject, and columns represent different raters/observers.

vars

Variables representing different raters/observers. Each variable should contain the ratings/diagnoses given by each observer for the same set of cases.

sft

Show frequency tables for each rater and cross-tabulation tables for pairwise comparisons.

heatmap

Show agreement heatmap visualization with color-coded agreement levels.

heatmapDetails

Show detailed heatmap with kappa values and confidence intervals for all rater pairs.

wght

Weighting scheme for the kappa analysis: 'unweighted' (the default), 'squared', or 'equal'. Use 'squared' or 'equal' only with ordinal ratings; weighted kappa accounts for the degree of disagreement rather than treating all disagreements as equivalent.

exct

Use the exact method for Fleiss' kappa when there are 3 or more raters. More accurate but computationally intensive.

kripp

Calculate Krippendorff's alpha, a generalized measure of reliability for any number of observers and data types.

krippMethod

Measurement level used for the Krippendorff's alpha calculation (default 'nominal'). Choose according to the measurement scale of the ratings. A combined usage sketch follows this argument list.
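
As a quick illustration of how these arguments combine, the call below is a minimal sketch; the data frame ratings and the columns rater1 and rater2 are hypothetical names, not part of the package:

agreement(
  data = ratings,
  vars = c("rater1", "rater2"),
  wght = "squared",  # weighted kappa, appropriate only for ordinal ratings
  kripp = TRUE       # additionally report Krippendorff's alpha
)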

Value

A results object containing:

results$todo               a html
results$overviewTable      a table
results$kappaTable         a table
results$krippTable         a table
results$heatmapPlot        an image
results$frequencyTables    a html

Tables can be converted to data frames with asDF or as.data.frame. For example:

results$overviewTable$asDF

as.data.frame(results$overviewTable)
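
Assuming a stored result object named results (as in the lines above), the same pattern retrieves any of the other tables, for example the kappa results:

results$kappaTable$asDF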

Examples

# \donttest{
# example will be added
# }
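
Until the official example is added, the following minimal sketch illustrates the interface; the simulated data frame ratings and its rater columns are purely hypothetical:

# simulate categorical ratings from three raters on 50 cases (hypothetical data)
set.seed(42)
ratings <- data.frame(
  rater1 = factor(sample(c("benign", "malignant"), 50, replace = TRUE)),
  rater2 = factor(sample(c("benign", "malignant"), 50, replace = TRUE)),
  rater3 = factor(sample(c("benign", "malignant"), 50, replace = TRUE))
)

# Fleiss' kappa across the three raters, with frequency tables
# and Krippendorff's alpha as an additional reliability measure
results <- agreement(
  data = ratings,
  vars = c("rater1", "rater2", "rater3"),
  sft = TRUE,
  kripp = TRUE
)

# kappa results as a data frame
results$kappaTable$asDF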