How to Calculate Cohen’s Kappa in Excel

Cohen’s Kappa is a statistic that measures inter-rater reliability for categorical items. Excel does not have a built-in function for Cohen’s Kappa, but it can be calculated with a few simple worksheet formulas once the two raters’ results are summarized in a table of agreement counts. The resulting value can then be interpreted to indicate the strength of the inter-rater reliability.


Cohen’s Kappa is used to measure the level of agreement between two raters or judges who each classify items into mutually exclusive categories.

The formula for Cohen’s kappa is calculated as:

k = (po – pe) / (1 – pe)

where:

  • po: Relative observed agreement among raters
  • pe: Hypothetical probability of chance agreement
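
Translated into Excel, this formula only needs cell references for the two proportions. As a minimal sketch, assuming po has already been calculated in cell E2 and pe in cell E3 (hypothetical cells chosen purely for illustration):

  =(E2-E3)/(1-E3)

The worked example below shows how po and pe themselves can be computed from a 2×2 table of rating counts.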

Rather than just calculating the percentage of items that the raters agree on, Cohen’s Kappa attempts to account for the fact that the raters may happen to agree on some items purely by chance.

The value for Cohen’s Kappa typically ranges between 0 and 1, with 0 indicating no agreement beyond chance between the two raters and 1 indicating perfect agreement between the two raters. (Negative values are possible when the raters agree less often than would be expected by chance.)

The following table summarizes how to interpret different values for Cohen’s Kappa, following the commonly used Landis and Koch guidelines:

  Cohen's Kappa     Interpretation
  Less than 0       Poor (worse-than-chance) agreement
  0.00 – 0.20       Slight agreement
  0.21 – 0.40       Fair agreement
  0.41 – 0.60       Moderate agreement
  0.61 – 0.80       Substantial agreement
  0.81 – 1.00       Almost perfect agreement

The following example shows how to calculate Cohen’s Kappa in Excel.

Example: Calculating Cohen’s Kappa in Excel

Suppose two art museum curators are asked to rate 70 paintings on whether they’re good enough to be shown in a new exhibit.

The following 2×2 table shows the results of the ratings:

[2×2 table: counts of “Yes”/“No” ratings from the two curators]

The following screenshot shows how to calculate Cohen’s Kappa for the two raters, including the formulas used:

[Screenshot: Cohen's Kappa calculation in Excel]

The po value represents the relative observed agreement between the raters. This is the proportion of all ratings on which the raters either both said “Yes” or both said “No”, which comes out to 0.6429 in this example.
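
As a rough sketch of the formula behind this step, suppose the 2×2 table of counts is entered in cells B2:C3, with Rater 1’s “Yes” and “No” counts in rows 2 and 3 and Rater 2’s “Yes” and “No” counts in columns B and C (this layout is an assumption for illustration and may differ from the screenshot). Then po could be computed as:

  =(B2+C3)/SUM(B2:C3)

Here B2 holds the number of paintings where both raters said “Yes” and C3 the number where both said “No”.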

The pe value represents the hypothetical probability that the raters could have agreed purely by chance. It is the sum of two terms: the probability that both raters say “Yes” by chance and the probability that both say “No” by chance, where each term is the product of the corresponding marginal proportions for the two raters.

This turns out to be 0.5.
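
Under the same assumed B2:C3 layout, pe could be computed from the row and column totals of the table as:

  =((B2+C2)/SUM(B2:C3))*((B2+B3)/SUM(B2:C3))+((B3+C3)/SUM(B2:C3))*((C2+C3)/SUM(B2:C3))

The first product is the chance probability that both raters say “Yes”; the second is the chance probability that both say “No”.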

The k value represents Cohen’s Kappa, which is calculated as:

  • k = (po – pe) / (1 – pe)
  • k = (0.6429 – 0.5) / (1 – 0.5)
  • k = 0.1429 / 0.5
  • k = 0.2857

Cohen’s Kappa turns out to be 0.2857.
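
Tying the sketches together: if the po and pe formulas above are placed in the hypothetical cells E2 and E3 used earlier, the formula =(E2-E3)/(1-E3) evaluates to approximately 0.2857, matching the result above.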

Based on the table from earlier, we would say that the two raters only had a “fair” level of agreement.
