Intraclass Correlation with Multiple Variables
Sub-group Analysis / 2-Way Random Effects Model
Input Data
The dataset shown below is included in one of the example worksheets of AgreeStat 360 (this video shows how you may import your own data to AgreeStat360). It contains the ratings that 3 judges J0, J1, and J2 assigned to 10 subjects belonging to 2 groups, labeled 1 and 2. Note that the 10 subjects were rated on 3 different variables: vglobal, vlad, and vlcx.
The goal is to compute the Intraclass Correlation Coefficient (ICC) separately for each group as well as for both groups combined. The ICC will be evaluated under the 2-way random effects model, in which both the rater and subject effects are random.
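To make the model concrete, here is a minimal Python sketch, independent of AgreeStat360's own computations, that applies the classical single-rating ICC of Shrout and Fleiss (often written ICC(2,1)) under the 2-way random effects model, separately by group and for both groups combined. The ratings below are made up; only the layout (3 judges J0, J1, J2, 10 subjects in 2 groups, variables vglobal, vlad, vlcx) mirrors the dataset described above, and AgreeStat360's exact estimators may differ.

```python
# Minimal sketch (not AgreeStat360 code): ICC(2,1) under the 2-way random
# effects model, computed from the two-way ANOVA mean squares.
import numpy as np
import pandas as pd

def icc_2way_random(scores: np.ndarray) -> float:
    """ICC(2,1) from an n-subjects x r-raters matrix of single ratings."""
    n, r = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)
    # Mean squares of the two-way ANOVA without replication
    ms_subj = r * ((subj_means - grand) ** 2).sum() / (n - 1)
    ms_rater = n * ((rater_means - grand) ** 2).sum() / (r - 1)
    resid = scores - subj_means[:, None] - rater_means[None, :] + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (r - 1))
    # Shrout & Fleiss formula for the single-rating, 2-way random model
    return (ms_subj - ms_err) / (
        ms_subj + (r - 1) * ms_err + r * (ms_rater - ms_err) / n
    )

# Made-up ratings that only mimic the layout of the example worksheet
rng = np.random.default_rng(0)
judges = ["J0", "J1", "J2"]
df = pd.DataFrame({
    "subject": np.repeat(range(10), 3),
    "group": np.repeat([1] * 5 + [2] * 5, 3),
    "judge": judges * 10,
    "vglobal": rng.normal(50, 10, 30).round(1),
    "vlad": rng.normal(45, 8, 30).round(1),
    "vlcx": rng.normal(55, 12, 30).round(1),
})

# ICC for variable 'vglobal': each group separately, then both combined
for label, part in [("Group 1", df[df.group == 1]),
                    ("Group 2", df[df.group == 2]),
                    ("Combined", df)]:
    wide = part.pivot(index="subject", columns="judge", values="vglobal")
    print(label, round(icc_2way_random(wide.to_numpy()), 3))
```

The same loop can be repeated for vlad and vlcx to obtain an ICC per variable, which is how the results are organized in the output discussed below.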
Analysis with AgreeStat/360
To see how AgreeStat360 processes this dataset to produce the various agreement coefficients, please play the video below. The video can also be watched on youtube.com for better clarity if needed.
Results
The output that AgreeStat360 produces is shown below and has 4 parts. The first part contains general descriptive statistics. The second part shows the different variance components, while the third and fourth parts show the inter-rater and intra-rater reliability estimates respectively, along with their precision measures.
Each of the 4 parts that make up this output contains statistics separately for Group 1, Group 2, and for both groups combined.
- Descriptive statistics: The first part of this output displays basic statistics, namely the number of non-missing observations for each variable, as well as their mean values or the number of subjects.
- Variance components: This section typically shows the error variance, the subject variance, the rater variance, and the subject-rater interaction variance. These variance components are of interest because they are used to calculate the inter-rater and intra-rater reliability coefficients, and examining them can help explain the magnitude of those coefficients (see the sketch after this list).
- Inter-rater reliability: This part shows the inter-rater reliability coefficient for each group and for both groups combined, along with the confidence intervals and associated p-values. Note that you can modify both the confidence level and the ICC null value to obtain the corresponding confidence intervals and p-values. The cells in blue contain dropdown lists with alternative confidence levels and ICC null values to choose from.
- Intra-rater reliability: This part shows the intra-rater reliability coefficient for each group and for both groups combined, along with the confidence intervals and associated p-values. Again, you can modify both the confidence level and the ICC null value to obtain the corresponding confidence intervals and p-values.