Hello everyone! I’m running a nominal logistic regression model (JMP v13) with 8 independent variables. I am concerned about collinearity and confounding. How can I check for these two issues? Can multicollinearity be evaluated with the Variance Inflation Factor in JMP? (Kindly, how?) Thanks.
Help ==> Statistics Index ==> multicollinearity points to the Variance Inflation Factors in the Fit Model platform for determining whether multicollinearity is present:
"In regression where the regressors are highly correlated, a measure of interest is how much the variance of an estimator is inflated compared with what its variance would be without the effect of the other regressors. In Fit Model the VIF is available by context-clicking in the Parameter Estimates report table."
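Not JMP, but if it helps to see what the number means: the VIF for predictor j is 1 / (1 - R²_j), where R²_j comes from regressing column j on the other predictors. A minimal numpy sketch (illustrative only, not JMP's implementation; the data here are made up, with two deliberately near-collinear columns):

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 is from regressing
    column j of X on the remaining columns (with an intercept)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + 0.05 * rng.normal(size=100)   # nearly a copy of x1: collinear
x3 = rng.normal(size=100)               # independent predictor
vals = vif(np.column_stack([x1, x2, x3]))
print([round(v, 1) for v in vals])      # x1, x2 huge; x3 near 1
```

A common rule of thumb is that VIFs above 5 or 10 deserve attention; here the two collinear columns blow past that while the independent one stays near 1.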
A couple suggestions for you:
1. Take full advantage of version 13's improved experimental design diagnostic capabilities: treat your matrix of predictor variables as if it were a designed experiment (even though it wasn't) and use the DOE -> Design Diagnostics -> Evaluate Design platform. That platform's report gives you all sorts of diagnostics, including confounding and correlations among your predictor variables, based on the predictor matrix alone; the platform doesn't require that your predictor matrix actually came from a designed experiment. One key feature of this platform from a confounding perspective is that you can specify the exact model form (main effects, two-way interactions, etc.) you'd like to estimate.
2. Use the Multivariate platform to explore correlations (pairwise, and more complex structure using, say, principal components analysis) among the matrix of predictor variables.
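For intuition on what suggestion 2 is looking for, here is a small numpy sketch (made-up data, not JMP's Multivariate platform): the pairwise correlation matrix flags two-variable collinearity, and a near-zero eigenvalue of that matrix (the principal-components view) flags a multicollinear combination that pairwise correlations can miss:

```python
import numpy as np

# Hypothetical predictor matrix; columns stand in for the independent variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)  # planted collinearity

# Pairwise correlations, as in the Multivariate platform's report:
R = np.corrcoef(X, rowvar=False)
print(np.round(R, 2))

# Eigenvalues of the correlation matrix (principal components):
# an eigenvalue near zero indicates a near-linear dependency.
eigvals = np.linalg.eigvalsh(R)
print(np.round(sorted(eigvals, reverse=True), 3))
```

The planted dependency shows up both as a large off-diagonal correlation and as a smallest eigenvalue close to zero.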
I assume that you are using the Analyze > Fit Model launch with the Nominal Logistic personality. My example from the Big Class sample data table fits sex = intercept + weight + height + weight * height. In the resulting report, the Covariance of Estimates table is available. The diagonal entries are the variances of the estimates and the off-diagonal entries are the covariances, which would be zero if there were no collinearity.
Notice that the predictors are automatically centered to minimize the correlation of the estimates.
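To see why that centering matters, here is a quick numpy sketch (made-up stand-ins for weight and height, not the actual Big Class data): when the variables are left on their raw, all-positive scales, the product column weight*height is almost perfectly correlated with the main effects, but forming the product from mean-centered columns removes most of that correlation:

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-ins for weight and height (positive values, like Big Class):
w = rng.normal(100, 15, size=40)
h = rng.normal(62, 4, size=40)

raw_int = w * h                            # interaction from raw columns
ctr_int = (w - w.mean()) * (h - h.mean())  # interaction from centered columns

r_raw = np.corrcoef(w, raw_int)[0, 1]
r_ctr = np.corrcoef(w, ctr_int)[0, 1]
print(round(r_raw, 2))  # close to 1: badly collinear with the main effect
print(round(r_ctr, 2))  # much closer to 0
```

This is the same idea behind JMP's automatic centering of continuous predictors that enter crossed or polynomial terms: it reduces the artificial correlation among the parameter estimates without changing the fitted model.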
Thanks. All the independent variables in your example are continuous. Can we expect this result to hold when the model also (or only) uses categorical variables? Could you kindly explain a bit more about your statement "Notice that the predictors are automatically centered to minimize the correlation of the estimates"? Which predictors, and 'centered' in relation to what?