
Kappa Coefficient and Agreement Statistic

barik
Level I

Hi all,

Assume that I have two columns, Bob and Mary, each rating four items with either a Y or an N. In the case below, no Agreement Statistic is offered when doing Fit Y by X, apparently because JMP thinks that Bob and Mary do not have the same categorical levels. How can I set up column "Bob" to tell JMP that Bob can also take on the values Y or N? For example, if I change one of the values in Bob to N, the Agreement Statistic becomes available again.

(Screenshot: 6287_2014-05-12_20-32-34.png)

I feel like this should be specified in Column Properties somewhere, but I can't figure out how to make it work.

Thanks!
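
For context, the Agreement Statistic in Fit Y by X is Cohen's kappa, which is built from the square cross-tabulation of the two raters over a shared set of levels. A minimal pure-Python sketch, using hypothetical ratings in place of the table in the screenshot, shows why the statistic needs the full level set for both columns:

```python
# Minimal sketch of what the Agreement Statistic (Cohen's kappa) computes.
# The ratings below are hypothetical stand-ins for the attached screenshot.
from collections import Counter

def cohens_kappa(rater1, rater2, levels):
    """Cohen's kappa computed over an explicit, shared set of levels."""
    n = len(rater1)
    # Observed agreement: proportion of items where the two raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement under independence, using each rater's marginal rates.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[lvl] / n) * (c2[lvl] / n) for lvl in levels)
    return (p_o - p_e) / (1 - p_e) if p_e < 1 else float("nan")

bob  = ["Y", "Y", "Y", "Y"]   # Bob only ever uses one level...
mary = ["Y", "N", "Y", "Y"]   # ...while Mary uses both.

# Passing the full level set explicitly plays the same role as telling JMP
# that Bob's column can also take the value N.
print(cohens_kappa(bob, mary, levels=["Y", "N"]))   # 0.0
```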

ACCEPTED SOLUTION

Jeff_Perkinson
Community Manager

Re: Kappa Coefficient and Agreement Statistic

Hi Titus,

Try adding one more row to your data table with an N for Bob (and either Y or N for Mary), and also add a Freq column with a 1 in every row except the new row (the one with the N for Bob), which should have a 0.

(Screenshot: 6289_BobMaryFreq.png)

Use the Freq column in the Freq role in Fit Y by X and you should get the option for the Agreement Statistic.

(Screenshot: 6290_BobMaryFreqFitYbyX.png)

(Screenshot: 6291_FitYByX.png)
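
Outside of JMP, the same idea can be sketched with scikit-learn's cohen_kappa_score (the ratings below are hypothetical stand-ins for the table in the screenshots): either declare the full level set explicitly, or append a dummy (N, N) row with a sample weight of 0, which plays the same role as the extra row with Freq = 0.

```python
# A sketch of the Freq-column trick outside JMP, using hypothetical ratings.
from sklearn.metrics import cohen_kappa_score

bob  = ["Y", "Y", "Y", "Y"]
mary = ["Y", "N", "Y", "Y"]

# Option 1: declare the full level set explicitly.
k_labels = cohen_kappa_score(bob, mary, labels=["N", "Y"])

# Option 2: append a dummy (N, N) row and give it zero weight,
# mirroring the extra row with Freq = 0 in the data table.
k_dummy = cohen_kappa_score(bob + ["N"], mary + ["N"],
                            sample_weight=[1, 1, 1, 1, 0])

print(k_labels, k_dummy)   # both 0.0 for this example
```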

-Jeff

pgstats
Level I


Re: Kappa Coefficient and Agreement Statistic

When one of the raters always says the same thing (Bob, in this case), kappa is zero by definition. You can show this by adding a weight variable and a dummy observation where Bob says N, but giving that observation a weight of zero.

(Screenshot: 6293_snip.PNG)
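
To make the arithmetic explicit: with a constant rater, the observed agreement p_o equals the chance agreement p_e, so the numerator of kappa = (p_o - p_e) / (1 - p_e) is zero. A short pure-Python check with hypothetical ratings:

```python
# Why kappa is 0 whenever one rater always gives the same answer:
# observed agreement p_o equals chance agreement p_e, so p_o - p_e = 0.
bob  = ["Y"] * 4              # constant rater (hypothetical data)
mary = ["Y", "N", "Y", "Y"]

n = len(bob)
p_o = sum(a == b for a, b in zip(bob, mary)) / n      # = P(Mary says Y)
p_e = sum((bob.count(l) / n) * (mary.count(l) / n)    # = 1*P(Mary=Y) + 0*P(Mary=N)
          for l in ("Y", "N"))
kappa = (p_o - p_e) / (1 - p_e)
print(p_o, p_e, kappa)   # 0.75 0.75 0.0
```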

PG