Cohen’s kappa is a measure of the agreement between two raters, with agreement due to chance factored out. Fleiss’ kappa extends this to the case where there are more than two raters. This extension is used, for example, by SEO marketing companies.
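In both cases the statistic has the same form: kappa = (observed agreement − agreement expected by chance) / (1 − agreement expected by chance), so a value of 1 means perfect agreement and 0 means agreement no better than chance.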
I have taken a set of supervised (manually observed) data subjects from several websites, noting whether a particular subject or point is present or not.
From that I created a test file containing the agreements and disagreements, coded as 1 and 0: 0 means disagreement and 1 means agreement.
The raters, in this situation, are the competitors.
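A hypothetical slice of such a test file might look like the sketch below; the column names comp1–comp3 are made up, and the real file has one column per competitor (10 in total) and one row per subject:

```r
# Illustrative layout only: rows are subjects, columns are competitors (raters),
# 1 = the subject/point is present (agreement), 0 = absent (disagreement)
example <- data.frame(
  comp1 = c(1, 0, 1),
  comp2 = c(1, 1, 1),
  comp3 = c(0, 1, 0)
)
example
```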
Now in R:
First, I imported the data from the default working directory:
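A minimal sketch of that step, assuming the file is named test.csv and sits in R's working directory:

```r
# Read the ratings: one row per subject, one column per competitor (file name assumed)
ratings <- read.csv("test.csv")
head(ratings)
```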
Then I applied Fleiss’ kappa:
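One common way to do this in R is the kappam.fleiss() function from the irr package; the sketch below assumes the ratings data frame created in the previous step:

```r
# install.packages("irr")  # if the package is not already installed
library(irr)

# Fleiss' kappa across all 10 competitors (raters)
kappa_result <- kappam.fleiss(ratings)
kappa_result
```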
Deriving the percentage of agreement for each subject:
We divide by 10 because there are 10 competitors (raters).
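A sketch of that computation, assuming the percentage is simply the share of the 10 competitors who marked each subject with a 1:

```r
# Share of the 10 raters marking each subject as 1, expressed as a percentage
agreement_pct <- rowSums(ratings) / 10 * 100
agreement_pct
```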
This gives the percentage of agreement for each subject.
Finally, the kappa agreement value summarizes the overall chance-corrected agreement across all competitors.
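If only the numeric statistic is needed, it can be pulled out of the kappam.fleiss() result (assuming the kappa_result object from the earlier step):

```r
# Overall Fleiss' kappa value
kappa_result$value
```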