If you have multiple reviewers, calculate the percentage agreement as follows. With this tool, you can easily calculate the degree of agreement between two judges during the selection of studies to be included in a meta-analysis. Fill in the fields to get the raw percentage agreement and the value of Cohen's kappa. Kappa is always less than or equal to 1; a value of 1 implies perfect agreement, and values below 1 imply less than perfect agreement. Multiply the quotient by 100 to get the percentage agreement. Equivalently, you can move the decimal point two places to the right, which gives the same result as multiplying by 100. For example, multiply 0.5 by 100 to get an overall agreement of 50 percent. Step 3: For each pair of judges, record a "1" for agreement and a "0" for disagreement. For example, for participant 4, Judge 1 and Judge 2 disagree (0), Judge 1 and Judge 3 disagree (0), and Judge 2 and Judge 3 agree (1). To interpret your Cohen's kappa results, you can refer to the guidelines of Landis, J.R., & Koch, G.G. (1977). The measurement of observer agreement for categorical data.

Biometrics, 33, 159-174. As a rough guide from that paper, kappa values below 0 indicate poor agreement; 0.00 to 0.20, slight; 0.21 to 0.40, fair; 0.41 to 0.60, moderate; 0.61 to 0.80, substantial; and 0.81 to 1.00, almost perfect agreement.

To calculate the percentage difference, you must determine the difference between two numbers as a percentage. This value can be useful when you want to express how far apart two figures are. Scientists can use the percentage difference between two results to show the relationship between them. To calculate it, take the difference between the two values, divide it by the average of the two values, and then multiply that number by 100.

A serious flaw in percentage agreement as a measure of inter-rater reliability is that it does not take chance agreement into account and therefore overestimates the level of agreement. This is the main reason why percentage agreement should not be used for scientific work (e.g., doctoral theses or scientific publications). The field in which you work determines the acceptable level of agreement. If it is a sporting competition, you might accept 60% agreement to name a winner. However, if you are looking at data from oncologists choosing a treatment, you need a much higher level of agreement: more than 90%.
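The pairwise scoring and percentage-agreement steps described earlier can be sketched in Python. The ratings and judge labels below are hypothetical, chosen so that participant 4 (the last column) reproduces the disagree/disagree/agree pattern from the example:

```python
from itertools import combinations

# Hypothetical yes/no inclusion decisions: one entry per judge,
# one column per participant (participant 4 is the last column).
ratings = {
    "judge1": [1, 0, 1, 0],
    "judge2": [1, 0, 1, 1],
    "judge3": [1, 1, 0, 1],
}

n_items = len(ratings["judge1"])

# Step 3: for every pair of judges and every participant,
# score 1 for agreement and 0 for disagreement.
pair_scores = {
    (a, b): [int(ratings[a][i] == ratings[b][i]) for i in range(n_items)]
    for a, b in combinations(ratings, 2)
}

# Percentage agreement: agreements divided by total comparisons, times 100.
agreements = sum(sum(scores) for scores in pair_scores.values())
comparisons = len(pair_scores) * n_items
percent_agreement = agreements / comparisons * 100
print(percent_agreement)  # 6 of 12 comparisons agree -> 50.0
```

With these made-up ratings, the quotient is 0.5, which multiplied by 100 gives the 50 percent figure used in the example above.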
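The percentage-difference formula just described can likewise be sketched as a small function; the function name and sample values are my own:

```python
def percent_difference(a: float, b: float) -> float:
    """Absolute difference between two values, divided by their
    average, times 100."""
    return abs(a - b) / ((a + b) / 2) * 100

# Example: two reviewers report 80 and 100 included studies.
# |80 - 100| / 90 * 100 = 22.2...
print(round(percent_difference(80, 100), 1))  # 22.2
```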

In general, more than 75% agreement is considered acceptable in most fields. Cohen's kappa is a statistical coefficient that represents the degree of accuracy and reliability of a statistical classification. It measures the agreement between two raters (judges) who each sort items into mutually exclusive categories.
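A minimal sketch of Cohen's kappa for two raters, assuming the ratings arrive as equal-length lists of category labels (the rater data here are hypothetical include/exclude decisions):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(r1)
    # Observed agreement p_o: share of items where the raters match.
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected chance agreement p_e from each rater's marginal proportions.
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical decisions for ten studies (1 = include, 0 = exclude).
rater_a = [1, 0, 1, 0, 1, 1, 0, 1, 0, 1]
rater_b = [1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
print(round(cohens_kappa(rater_a, rater_b), 3))  # 0.583
```

Note that the raw observed agreement in this example is 80%, yet kappa comes out at only about 0.58; this is the chance correction at work, and it is why kappa reads lower than percentage agreement on the same data.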