How can Fleiss’ Kappa be calculated using Excel?

Fleiss’ Kappa is a statistical measure used to assess the agreement among multiple raters on a categorical variable. It is commonly used in fields such as psychology, sociology, and medicine. Excel can be used to calculate Fleiss’ Kappa by following these steps:

1. Create a table in Excel, with each row representing a subject and each column representing a rater.
2. Assign numerical codes to the categories being rated (e.g. 1 for “agree”, 2 for “disagree”, etc.).
3. Enter the ratings for each rater in the corresponding cells of the table.
4. Tally the results: count the number of raters (n), the number of subjects (N), and the number of categories (k), and record how many raters assigned each subject to each category.
5. Calculate the observed agreement (P): for each subject, compute the proportion of rater pairs that agree, then average these proportions across all subjects.
6. Calculate the agreement expected by chance (Pe): compute the overall proportion of ratings that fall in each category, square each proportion, and sum the squares.
7. Calculate Fleiss’ Kappa (K) using the formula: K = (P – Pe) / (1 – Pe)
8. The resulting value of K has a maximum of 1, with 1 indicating perfect agreement, 0 indicating no agreement beyond chance, and negative values indicating agreement worse than chance.
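
To make these steps concrete, here is a minimal worksheet sketch, assuming a hypothetical layout in which the tallied counts (how many raters chose each category for each subject) sit in B2:F11 (10 subjects in rows, 5 categories in columns) and every row sums to the same number of raters. All cell addresses here are illustrative only; adjust them to match your own sheet.

  I1: =SUM(B2:F2)                                      ' n, the number of raters per subject
  G2: =(SUMPRODUCT(B2:F2,B2:F2)-$I$1)/($I$1*($I$1-1))  ' per-subject agreement; copy down to G11
  B13: =SUM(B2:B11)/SUM($B$2:$F$11)                    ' proportion of ratings in category 1; copy across to F13
  I3: =AVERAGE(G2:G11)                                 ' P, the observed agreement
  I4: =SUMPRODUCT(B13:F13,B13:F13)                     ' Pe, the agreement expected by chance
  I5: =(I3-I4)/(1-I4)                                  ' Fleiss' Kappa

SUMPRODUCT of a range with itself sums the squared counts, which is what both the per-subject agreement and the chance agreement require.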

Calculate Fleiss’ Kappa in Excel


Fleiss’ Kappa is a way to measure the degree of agreement between three or more raters when the raters are assigning categorical ratings to a set of items.

Fleiss’ Kappa ranges from 0 to 1 where:

  • 0 indicates no agreement at all among the raters.
  • 1 indicates perfect inter-rater agreement.

This tutorial provides an example of how to calculate Fleiss’ Kappa in Excel.

Example: Fleiss’ Kappa in Excel

Suppose 14 individuals rate 10 different products on a scale of Poor to Excellent.

The following screenshot displays the total ratings that each product received:

The following screenshot shows how to calculate Fleiss’ Kappa for this data in Excel:

[Screenshot: Fleiss’ Kappa calculation in Excel]

The trickiest calculations in this screenshot are in column J. The formula used for these calculations is shown in the text box near the top of the screen.
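
Although the exact cell references depend on the sheet, the per-product agreement in column J can be computed with a formula of the following general form. This is a sketch, assuming the rating counts for the first product occupy B2:F2 and each product is rated by 14 raters:

  J2: =(SUMPRODUCT(B2:F2,B2:F2)-14)/(14*(14-1))   ' agreement for the first product; copy down for the rest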

Note that the Fleiss’ Kappa in this example turns out to be 0.2099. The actual formula used to calculate this value in cell C18 is:

Fleiss’ Kappa = (0.37802 – 0.2128) / (1 – 0.2128) = 0.2099.
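
In cell terms, if the observed agreement (0.37802) and the chance agreement (0.2128) were held in hypothetical cells C16 and C17, the formula in C18 would be:

  C18: =(C16-C17)/(1-C17)   ' Fleiss' Kappa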

Although there is no formal way to interpret Fleiss’ Kappa, the following values show how to interpret Cohen’s Kappa, which is used to assess the level of inter-rater agreement between just two raters:

  • < 0.20 | Poor
  • 0.21 – 0.40 | Fair
  • 0.41 – 0.60 | Moderate
  • 0.61 – 0.80 | Good
  • 0.81 – 1.00 | Very Good

Based on these values, the Fleiss’ Kappa of 0.2099 in our example would be interpreted as a “fair” level of inter-rater agreement.
