01 HME 712 Week 1 Assoc Agree Kappa Statistic

This document provides an introduction to calculating measures of agreement using kappa statistics. It explains the difference between association and agreement, and gives examples where agreement is important to evaluate. The kappa statistic measures agreement beyond chance for categorical data. An example calculation of kappa is shown. Guidelines for interpreting the value of kappa are presented, with kappa between 0.41-0.60 considered a fair level of agreement.



Calculating measures of
agreement: the kappa statistic
HME 712

2020-2022
Prepared and presented by:
BV Girdler-Brown
©University of Pretoria Faculty of Health Sciences, 2020. All rights reserved

Introduction to measures of agreement


Outline for the slide show

In this slide show we will:

• Explain the difference between association and agreement
• Give some examples where we might want to evaluate agreement
• Describe two methods of assessing agreement
• Explain the concept behind the first measure (the kappa statistic)
• Provide a guideline for interpreting the kappa result


Agreement vs. Association


We will focus on the kappa statistic first. During the module HME 711 you were introduced to measures of association. In this session we will learn how to calculate two important measures of agreement.

As an example, height and body mass are associated (as height increases, body mass also increases); however, they are not equal. In the illustrative scatter plot, the mean body mass was 65 kg while the mean height was 165 cm. Clearly body mass ≠ height, but the two are "associated" because they vary together.

[Figure: scatter plot of body mass (kg, roughly 40–120) against height (cm, roughly 150–190).]
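To make the distinction concrete, here is a minimal Python sketch of "associated but not equal". The numbers are invented purely for illustration: the two variables rise together (a near-perfect Pearson correlation), yet they clearly do not agree.

```python
# Hypothetical data, invented only to illustrate association without agreement.
heights_cm = [150, 155, 160, 165, 170, 175, 180, 185, 190]
masses_kg = [48, 53, 57, 62, 66, 71, 76, 80, 85]

n = len(heights_cm)
mean_h = sum(heights_cm) / n
mean_m = sum(masses_kg) / n

# Pearson's correlation coefficient: the association here is almost perfect...
cov = sum((h - mean_h) * (m - mean_m) for h, m in zip(heights_cm, masses_kg))
r = cov / (sum((h - mean_h) ** 2 for h in heights_cm)
           * sum((m - mean_m) ** 2 for m in masses_kg)) ** 0.5
print(round(r, 3))                     # close to 1

# ...yet the two variables do not "agree"; they are not even on the same scale.
print(round(mean_h), round(mean_m))    # about 170 (cm) vs 66 (kg)
```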


Examples of agreement being important


We would want to be sure that these sets of data agree

1. Two different laboratories use different methods to measure a biomarker. Are the methods comparable, i.e. do they agree?

2. Two different clinicians examine the same set of chest X-rays to diagnose tuberculosis. Do they agree in their diagnoses?

3. A laboratory analyst measures blood lead twice using the same method: does he get the same results for both readings?

4. Study participants are asked to fill out a Likert questionnaire two weeks apart; do they provide the same answers?

Items 1 and 3 involve agreement between numerical data, while items 2 and 4 involve agreement between categorical data.

We need different methods to measure agreement for numerical and categorical data sets.


Association measures vs. Agreement measures


The table is not exhaustive; we show just a few measures that we will use

Data type:             Categorical data                  Numerical data

Association measures:  Odds ratios, risk ratios, etc.    Pearson's correlation coefficient;
                                                         Spearman's (rank) correlation coefficient

Agreement measures:    Kappa statistic;                  The methods of Bland and Altman
                       weighted kappa statistic          (especially the graphical approach)

This slide show is about the Kappa statistics and the Bland and Altman methods


The Kappa statistic


The concept
Consider the following scenario. Two medical doctors each examine the same set of 20 chest X-rays. They must decide whether each X-ray shows the presence of tuberculosis (TB) or not. Each doctor diagnoses "blind" to the other doctor's diagnosis. How well do they agree?

The problem is that even if the doctors were blindfolded they would agree on some X-rays by sheer luck, i.e. "by chance".

There are 20 X-rays, hence 20 opportunities to "agree". If the doctors would agree on 3 of these X-rays even while blindfolded, then there are (20 − 3) = 17 opportunities to agree beyond chance, i.e. beyond guessing.

If they agree in the diagnosis for 12 of these 17 X-rays, then the percentage agreement beyond chance is (100 × 12/17) = 70.59%, and the kappa statistic is then 0.7059.

The kappa statistic is the proportion of agreement beyond chance.
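A minimal Python sketch of this reasoning, taking the 3 chance agreements as given and reading the 12 as agreements on top of those 3:

```python
# Kappa as agreement beyond chance, using the numbers from the scenario above.
n_xrays = 20                                    # opportunities to agree
chance_agreements = 3                           # agreements expected by guessing alone (taken as given)
observed_agreements = chance_agreements + 12    # the doctors agree on 12 further X-rays "beyond chance"

kappa = (observed_agreements - chance_agreements) / (n_xrays - chance_agreements)
print(round(kappa, 4))                          # 0.7059
```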


An example of kappa

A simple calculation:

                                Observation 2
                           Positive (+)   Negative (−)   Total
Observation 1  Positive (+)         30             15      45
               Negative (−)         10             45      55
Total                               40             60     100

1. Opportunities for agreement = 100
2. Proportion agreement ("concordance") = (30 + 45)/100 = 75/100 = 0.75 (or 75%)
3. If guessing, they would agree on 51 (accept this for now; we will explain the calculation later, in the third slide show for this week)
4. Thus there are 100 − 51 = 49 opportunities left over to agree "beyond chance"
5. In total they agree on 75 diagnoses (this includes the 51 from guessing), i.e. 75 − 51 = 24 agreements beyond chance
6. So kappa = (75 − 51)/(100 − 51) = 24/49 = 0.49
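A short Python sketch of these six steps, using the 2×2 table above. The row-by-column formula for the 51 chance agreements anticipates the explanation promised in step 3:

```python
# Kappa from the 2x2 table: rows are Observation 1 (+, -), columns are Observation 2 (+, -).
table = [[30, 15],
         [10, 45]]

n = sum(sum(row) for row in table)            # 100 opportunities for agreement
observed = table[0][0] + table[1][1]          # 75 observed agreements (concordance = 0.75)

row_totals = [sum(row) for row in table]      # [45, 55]
col_totals = [sum(col) for col in zip(*table)]  # [40, 60]

# Chance agreements: for each category, row total x column total / n, summed over categories.
expected = sum(r * c / n for r, c in zip(row_totals, col_totals))   # 45*40/100 + 55*60/100 = 51

kappa = (observed - expected) / (n - expected)   # (75 - 51) / (100 - 51)
print(round(kappa, 2))                           # 0.49
```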


The interpretation of Kappa statistic values


What Kappa value indicates “good” agreement?

Byrt* has proposed the following interpretation of Kappa statistics for clinical settings:

0.93 – 1.00   Excellent agreement
0.81 – 0.92   Very good agreement
0.61 – 0.80   Good agreement
0.41 – 0.60   Fair agreement
0.21 – 0.40   Slight agreement
0.01 – 0.20   Poor agreement
≤ 0.00        No agreement

Using Byrt's criteria, the kappa of 0.49 from the previous slide suggests "Fair" agreement.

* Byrt T. 1996. How good is that agreement? Epidemiology 7:561.
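For convenience, a small helper could map a kappa value onto Byrt's labels as listed above; the function name is just an illustration, not part of the course material.

```python
def byrt_label(kappa: float) -> str:
    """Map a kappa value to Byrt's descriptive label, as listed on this slide."""
    if kappa >= 0.93:
        return "Excellent agreement"
    if kappa >= 0.81:
        return "Very good agreement"
    if kappa >= 0.61:
        return "Good agreement"
    if kappa >= 0.41:
        return "Fair agreement"
    if kappa >= 0.21:
        return "Slight agreement"
    if kappa > 0.00:
        return "Poor agreement"
    return "No agreement"

print(byrt_label(0.49))   # "Fair agreement", matching the worked example
```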


Thank You
