AgreeStat/360: computing agreement coefficients (Conger's kappa, Fleiss' kappa, Gwet's AC1/AC2, Krippendorff's alpha, and more) for 3 or more raters, by subgroup
AgreeStat Analytics

## 4-Rater Agreement: Unweighted Analysis of Raw Scores by Subgroup

### Input Data

The dataset shown below is included in one of the example worksheets of AgreeStat/360 and can be downloaded from the program. It contains the ratings that 4 raters assigned to 12 units. None of the 4 raters rated all 12 units; therefore, the dataset contains several missing ratings.

The objective is to compute the unweighted extent of agreement among the 4 raters, separately by gender and for both genders combined, using AgreeStat/360 for Excel/Windows.
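To make the data layout concrete, here is an illustrative stand-in for the example worksheet, built with pandas. The rater names, categories, and the gender column values below are hypothetical, not the actual shipped data; `NaN` marks a missing rating, and the by-gender analyses simply run the same computation on each subset:

```python
import pandas as pd
import numpy as np

# Hypothetical layout of the example worksheet: one row per unit,
# one column per rater, plus a Gender subgroup column.
# NaN marks a missing rating: no rater scored all 12 units.
ratings = pd.DataFrame({
    "Unit":   range(1, 13),
    "Gender": ["F", "M"] * 6,
    "Rater1": ["a", "b", np.nan, "a", "b", "a", "a", np.nan, "b", "a", "b", "a"],
    "Rater2": ["a", np.nan, "b", "a", "b", "a", np.nan, "b", "b", "a", "b", "a"],
    "Rater3": [np.nan, "b", "b", "a", np.nan, "a", "a", "b", "b", "a", np.nan, "a"],
    "Rater4": ["a", "b", "b", np.nan, "b", np.nan, "a", "b", np.nan, "a", "b", "a"],
})

# "Separately by gender" amounts to splitting the frame before analysis:
females = ratings[ratings["Gender"] == "F"]
males = ratings[ratings["Gender"] == "M"]
```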

### Analysis with AgreeStat/360

To see how AgreeStat/360 processes this dataset and produces the various agreement coefficients, please play the video below. For better resolution, the video can also be watched directly on youtube.com.

### Results

The output that AgreeStat/360 produces is shown below and contains a separate analysis for females, for males, and for both genders combined. Each analysis contains the following 2 parts:

• Summary data: The first part of the output shows the distribution of subjects by rater and category. The row marginal totals show the number of subjects each rater rated, while the column marginal averages show how many subjects, on average, the raters classified into each category.

• Unweighted analysis: Six agreement coefficients are calculated, including Conger's kappa, Gwet's AC1, and more. Each agreement coefficient comes with precision measures calculated with respect to subjects only (i.e., the raters are treated as fixed and do not constitute a source of variation), as well as with respect to both subjects and raters. These precision measures are the standard error, the 95% confidence interval, and the p-value.
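The summary-data table (raters in rows, categories in columns, with marginals) can be sketched in pandas. The `ratings` frame below is a small illustrative stand-in, not the actual worksheet data:

```python
import pandas as pd
import numpy as np

# Illustrative ratings: 6 units scored by 4 raters; NaN = missing rating.
ratings = pd.DataFrame({
    "Rater1": ["a", "b", np.nan, "a", "b", "a"],
    "Rater2": ["a", np.nan, "b", "a", "b", "a"],
    "Rater3": [np.nan, "b", "b", "a", "b", "a"],
    "Rater4": ["a", "b", "b", np.nan, "b", "a"],
})

# Distribution of subjects by rater and category: rows = raters, cols = categories.
dist = ratings.apply(lambda col: col.value_counts()).T.fillna(0).astype(int)
dist["Total"] = dist.sum(axis=1)             # row marginal: units rated by each rater
col_avg = dist.drop(columns="Total").mean()  # column marginal averages per category
```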
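As an example of one of the six coefficients, here is a minimal sketch of the unweighted Gwet's AC1 for multiple raters with missing ratings: percent agreement is averaged over units rated by at least 2 raters, and the chance-agreement term uses the average classification probabilities per category. This is a textbook-style sketch, not AgreeStat/360's implementation, and it omits the standard errors, confidence intervals, and p-values the program reports:

```python
from collections import Counter

def gwet_ac1(data):
    """Unweighted Gwet's AC1. `data` is a list of per-unit rating lists;
    None marks a missing rating."""
    cats = sorted({r for unit in data for r in unit if r is not None})
    q = len(cats)
    pi = dict.fromkeys(cats, 0.0)  # running sums of r_ik / r_i per category
    pa_terms = []                  # per-unit agreement, units with >= 2 ratings
    n = 0                          # units rated by at least one rater
    for unit in data:
        scores = [r for r in unit if r is not None]
        ri = len(scores)
        if ri == 0:
            continue
        n += 1
        counts = Counter(scores)
        for k in cats:
            pi[k] += counts[k] / ri
        if ri >= 2:  # agreement is only defined with 2+ ratings on a unit
            pa_terms.append(sum(c * (c - 1) for c in counts.values())
                            / (ri * (ri - 1)))
    pa = sum(pa_terms) / len(pa_terms)                        # percent agreement
    pe = sum((pi[k] / n) * (1 - pi[k] / n) for k in cats) / (q - 1)
    return (pa - pe) / (1 - pe)
```

For instance, two units rated identically by all 4 raters yield AC1 = 1, while any disagreement or missing rating lowers the observed-agreement term.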