Cohen's Kappa is a measure of the degree of agreement between two raters classifying items into categories. To calculate Cohen's Kappa with R, you can use the "irr" package, which contains functions for Cohen's Kappa and other measures of inter-rater agreement. The "irr" package is available on the Comprehensive R Archive Network (CRAN) and can be installed with the command install.packages("irr"). Once the "irr" package is installed, you can use the kappa2() function to calculate Cohen's Kappa.
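As a minimal sketch of that workflow (the ratings below are made-up illustrative data, not from a real study), kappa2() takes a two-column object with one row per subject and one column per rater:

```r
# install.packages("irr")  # run once if the package is not yet installed
library(irr)

# Hypothetical ratings: two raters classify ten subjects as "yes" or "no"
ratings <- data.frame(
  rater1 = c("yes", "yes", "yes", "yes", "yes", "yes", "no", "no", "no", "no"),
  rater2 = c("yes", "yes", "yes", "yes", "no",  "no",  "yes", "no", "no", "no")
)

# Unweighted Cohen's Kappa for the two raters
result <- kappa2(ratings)
result$value  # the kappa estimate
```

For this table the raters agree on 7 of 10 subjects (po = 0.7) and the chance agreement works out to pe = 0.5, so kappa2() should report a kappa of 0.4.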
@import url('https://fonts.googleapis.com/css?family=Droid+Serif|Raleway');
.axis--y .domain {
display: none;
}
h1 {
color: black;
text-align: center;
margin-top: 15px;
margin-bottom: 0px;
font-family: 'Raleway', sans-serif;
}
h2 {
color: black;
font-size: 20px;
text-align: center;
margin-bottom: 15px;
margin-top: 15px;
font-family: 'Raleway', sans-serif;
}
p {
color: black;
text-align: center;
margin-bottom: 15px;
margin-top: 15px;
font-family: 'Raleway', sans-serif;
}
#words_intro {
color: black;
font-family: Raleway;
max-width: 550px;
margin: 25px auto;
line-height: 1.75;
}
#words_intro_center {
text-align: center;
color: black;
font-family: Raleway;
max-width: 550px;
margin: 25px auto;
line-height: 1.75;
}
#words_outro {
color: black;
font-family: Raleway;
max-width: 550px;
margin: 25px auto;
line-height: 1.75;
}
#words {
color: black;
font-family: Raleway;
max-width: 550px;
margin: 25px auto;
line-height: 1.75;
padding-left: 100px;
}
#calcTitle {
text-align: center;
font-size: 20px;
margin-bottom: 0px;
font-family: 'Raleway', serif;
}
#hr_top {
width: 30%;
margin-bottom: 0px;
margin-top: 10px;
border: none;
height: 2px;
color: black;
background-color: black;
}
#hr_bottom {
width: 30%;
margin-top: 15px;
border: none;
height: 2px;
color: black;
background-color: black;
}
.input_label_calc {
display: inline-block;
vertical-align: baseline;
width: 350px;
}
#button_calc {
border: 1px solid grey;
border-radius: 10px;
margin-top: 20px;
padding: 10px 10px;
cursor: pointer;
outline: none;
background-color: white;
color: black;
font-family: 'Work Sans', sans-serif;
}
#button_calc:hover {
background-color: #f6f6f6;
border: 1px solid black;
}
.label_radio {
text-align: center;
}
Cohen's Kappa is computed as k = (po - pe) / (1 - pe), where:
- po: Relative observed agreement among raters
- pe: Hypothetical probability of chance agreement
Cohen's Kappa: 0.2857
function calc() {
//read the four cell counts from the page ("* 1" coerces the strings to numbers)
var bothyes = document.getElementById('bothyes').value * 1;
var bothno = document.getElementById('bothno').value * 1;
var yes1 = document.getElementById('yes1').value * 1;
var yes2 = document.getElementById('yes2').value * 1;
//total number of subjects
var n = bothyes + bothno + yes1 + yes2;
//observed agreement: proportion of subjects the two raters agree on
var po = (bothyes + bothno) / n;
//chance agreement: probability both say yes by chance plus both say no by chance
var pe_1 = ((bothyes + yes1) / n) * ((bothyes + yes2) / n);
var pe_2 = ((n - yes1 - bothyes) / n) * ((n - yes2 - bothyes) / n);
var pe = pe_1 + pe_2;
//Cohen's Kappa
var k = (po - pe) / (1 - pe);
//output
document.getElementById('k').innerHTML = k.toFixed(4);
}
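The same arithmetic can be sketched as a standalone function, separate from the page's input fields. The counts below are made-up values chosen for illustration; they happen to reproduce the 0.2857 shown above:

```javascript
// Cohen's Kappa from the four cells of a 2x2 agreement table
function cohensKappa(bothYes, bothNo, onlyYes1, onlyYes2) {
  const n = bothYes + bothNo + onlyYes1 + onlyYes2;  // total subjects
  const po = (bothYes + bothNo) / n;                 // observed agreement
  // chance agreement: both yes by chance plus both no by chance
  const peYes = ((bothYes + onlyYes1) / n) * ((bothYes + onlyYes2) / n);
  const peNo = ((n - bothYes - onlyYes1) / n) * ((n - bothYes - onlyYes2) / n);
  const pe = peYes + peNo;
  return (po - pe) / (1 - pe);
}

console.log(cohensKappa(65, 55, 75, 5).toFixed(4)); // "0.2857"
```

Here n = 200, po = 120/200 = 0.6 and pe = 0.245 + 0.195 = 0.44, so kappa is 0.16 / 0.56 ≈ 0.2857.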