Okay, so I found the Kappa coefficient; it's in the "Agreement Statistic" table. The "Agreement Statistic" table only appears when both X and Y variables have the same levels. What does that mean? Is there a way to check for this?
It's really just "apples to apples" versus "apples to oranges." "Same levels" means both X and Y are categorical variables that use the identical set of categories; Kappa doesn't apply to continuous data.
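To make the "same levels" check concrete, here is a minimal sketch in plain Python (this is an illustration, not how JMP checks it internally): two classifications can be assessed for agreement only when they draw from the identical set of category labels.

```python
def same_levels(x, y):
    """Return True if both variables use the identical set of category labels."""
    return set(x) == set(y)

# Two classifications of intelligence: same categories, so agreement makes sense
iq_a = ["low", "medium", "high", "medium"]
iq_b = ["high", "low", "medium", "low"]

# A classification of monetary assets: different categories entirely
assets = ["poor", "rich", "rich", "poor"]

print(same_levels(iq_a, iq_b))   # same levels -> Kappa (agreement) is meaningful
print(same_levels(iq_a, assets)) # different levels -> association only
```

If the two sets of labels differ, you can still look at association (e.g. in a contingency table), but "agreement" has no meaning, which is why the table only appears when the levels match.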
If you have a classification of intelligence and another classification of monetary assets, you can use Contingency to see whether these variables are associated (analogous to correlation with continuous variables). On the other hand, if you had two classifications of intelligence, you could assess the association, but you could also explore a stricter relationship: agreement. What proportion of the time did the two classifications agree?
We measure the strength of association with the odds ratio. We measure the strength of agreement with Kappa.
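For intuition, Cohen's Kappa is the observed proportion of agreement corrected for the agreement expected by chance alone. Here is a small sketch in plain Python (a from-scratch illustration under the usual formula, not JMP's implementation):

```python
def cohens_kappa(x, y):
    """Cohen's kappa for two classifications that share the same levels:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is agreement expected by chance from the marginal proportions."""
    assert len(x) == len(y), "both classifications must rate the same cases"
    n = len(x)
    levels = sorted(set(x) | set(y))
    # observed agreement: proportion of cases where both classifications match
    p_o = sum(a == b for a, b in zip(x, y)) / n
    # expected agreement under independence, from each variable's marginals
    p_e = sum((x.count(lvl) / n) * (y.count(lvl) / n) for lvl in levels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters classifying the same six cases
rater1 = ["low", "low", "high", "high", "low", "high"]
rater2 = ["low", "low", "high", "low",  "low", "high"]
print(cohens_kappa(rater1, rater2))  # -> 0.666..., well above chance
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is exactly the stricter question agreement asks compared with association.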