Fleiss' Kappa: The Definitive Guide

Cohen's Kappa

Cohen's kappa is a measure of the agreement between two raters, where agreement due to chance is factored out. We now extend Cohen's kappa to the case where the number of raters can be more than two. This extension is called Fleiss' kappa.
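Concretely, Fleiss' kappa compares the agreement actually observed with the agreement expected by chance:

```latex
\kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e}
```

where \(\bar{P}\) is the mean observed agreement across all subjects and \(\bar{P}_e\) is the agreement expected by chance, computed from the overall proportion of ratings falling in each category. A kappa of 1 means perfect agreement; 0 means agreement no better than chance.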


I have taken a set of subjects observed across several websites, recording for each site whether the particular subject (data point) is present or not.

Accordingly, I created a test file containing the agreements and disagreements, represented as 1 and 0: 0 means disagreement and 1 means agreement.

[Image: the test data file, subjects in rows and raters in columns, coded 0/1]


The raters in this situation are the competitors (the websites being compared).

Now in R:

Imported the data from the working directory:
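A minimal sketch of this step. The original CSV isn't available, so the file name below is a placeholder and the small inline data frame stands in for the real data with the same layout:

```r
# In the post the data comes from a CSV in the working directory;
# "test.csv" is a placeholder name:
# data <- read.csv("test.csv")

# Stand-in data with the same shape: rows are subjects, columns are
# raters (competitor sites), cells are 1 (present) or 0 (absent)
data <- data.frame(site1 = c(1, 0, 1, 1, 0),
                   site2 = c(1, 0, 1, 0, 0),
                   site3 = c(1, 1, 1, 0, 0))
head(data)
```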

Imported libraries:

[Image: loading the required libraries]
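The screenshot isn't recoverable, but the `irr` package is the usual source of a Fleiss kappa function in R, so presumably the step looked like this:

```r
# The irr package provides kappam.fleiss(); install it once with
# install.packages("irr")
library(irr)
```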

Applied Fleiss' kappa:

[Image: applying the Fleiss kappa function to the data]
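A sketch of this step, assuming the `irr` package; the inline data frame is a stand-in for the imported file:

```r
library(irr)  # provides kappam.fleiss()

# Stand-in ratings: subjects in rows, raters in columns, categories 0/1
ratings <- data.frame(site1 = c(1, 0, 1, 1, 0),
                      site2 = c(1, 0, 1, 0, 0),
                      site3 = c(1, 1, 1, 0, 0))

# kappam.fleiss() treats each column as one rater's ratings
result <- kappam.fleiss(ratings)
result$value   # the kappa estimate
```

Printing `result` directly also shows the number of subjects, the number of raters, and a z-test of kappa against zero.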


[Image: the Fleiss kappa output]


Deriving the percentage of agreement for each subject:

[Image: computing the percentage of agreement for each subject]

Dividing by 10, because there are 10 competitors (raters).
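This computation can be sketched as follows; the small ratings table is a stand-in, and `ncol()` plays the role of the divisor (10 in the post's data):

```r
# Each row sum counts the raters who marked 1; dividing by the number
# of raters and scaling by 100 gives the agreement percentage per subject
ratings <- data.frame(site1 = c(1, 0, 1),
                      site2 = c(1, 0, 0),
                      site3 = c(1, 1, 0))
agreement_pct <- rowSums(ratings) / ncol(ratings) * 100
agreement_pct
```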


[Image: writing the results to a file]
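Presumably the per-subject percentages were then written out to a CSV; a sketch of that step, with a guessed output file name and stand-in data:

```r
# Collect the per-subject agreement and write it to disk;
# "results.csv" is a guessed file name
ratings <- data.frame(site1 = c(1, 0, 1),
                      site2 = c(1, 0, 0),
                      site3 = c(1, 1, 0))
results <- data.frame(subject = seq_len(nrow(ratings)),
                      agreement_pct = rowSums(ratings) / ncol(ratings) * 100)
write.csv(results, "results.csv", row.names = FALSE)
```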

Result file:

[Image: the result file]

The percentage of agreement for each subject

The overall Fleiss' kappa value

