JobbeGoossens
Level II

Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

Dear potential saviors

 

Recently I've run into some inconsistencies in JMP depending on how exactly I tackle the problem. When I manually code the interaction of a numerical variable and a dummy variable (i.e., I multiply them), I do not get the same solution as when I simply specify in the Fit Model dialog that I want to cross these effects. Moreover, when I did the same in R, I do not see this behavior and the two results are identical, leading me to suspect that it was not my manual coding of the variable that went wrong but rather a peculiarity of JMP. (I'm using a package built on the lme4 package, as well as its robust variant; neither shows this behavior.)

 

Thanks in advance for your help and answers

 

Jobbe 

6 REPLIES

Re: Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

Please see this explanation of the factor models used in the Fit Least Squares platform. There is more than one way to code regressor levels.

JobbeGoossens
Level II

Re: Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

Thanks for providing the link. After some digging I found the cause: the automatic centering of the variables prior to combining them into an interaction. When I disable this option, the solutions match again. However, it still makes me wonder how the results should be interpreted. The p-values are not even close (a term goes from clearly significant to not significant at all). Why is centering the default, and should the p-values be read as testing a different hypothesis, and if so, which one? In my mind, a linear transformation such as subtracting the mean should not have this effect?
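For anyone who wants to see this concretely, here is a minimal sketch in Python/NumPy with simulated data (not JMP or R output; the centered column imitates JMP's Center Polynomials behavior as described in this thread). The two codings span the same column space, so the fitted values are identical and the interaction estimate itself is unchanged; only the lower-order coefficients shift:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(10, 2, n)                 # numeric predictor, mean far from 0
d = rng.integers(0, 2, n).astype(float)  # two-level dummy variable
y = 1.0 + 0.5 * x + 2.0 * d + 0.8 * x * d + rng.normal(0, 1, n)

def ols(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, X @ beta

# Raw coding: the interaction column is the plain product x*d
X_raw = np.column_stack([np.ones(n), x, d, x * d])
b_raw, fit_raw = ols(X_raw, y)

# Centered coding: the interaction column is (x - mean(x)) * (d - mean(d))
xc, dc = x - x.mean(), d - d.mean()
X_cen = np.column_stack([np.ones(n), x, d, xc * dc])
b_cen, fit_cen = ols(X_cen, y)

print(np.allclose(fit_raw, fit_cen))   # → True: same fit, same R^2
print(np.isclose(b_raw[3], b_cen[3]))  # → True: same interaction estimate
print(b_raw[1], b_cen[1])              # different "main effect" of x
```

The x coefficient differs by exactly the interaction estimate times mean(d), which is why its significance test can flip even though the model as a whole is unchanged.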

Victor_G
Super User

Re: Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

Hi @JobbeGoossens,

JMP automatically centers variables involved in interactions to avoid multicollinearity between the interaction term and the main-effect terms it is built from.

You can see the comparative example I did for a similar topic in the Community:
https://community.jmp.com/t5/Discussions/Stepwise-model-question/m-p/591848/highlight/true#M79593

Automatic centering is actually the safest way to evaluate interactions and calculate parameter estimates: by reducing multicollinearity with the main effects involved in the interaction, you reduce the variance of the interaction parameter estimates, which yields more reliable and accurate significance testing.
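A quick illustration of the multicollinearity point (a Python/NumPy sketch with simulated data, not JMP output): the raw product x*d is almost perfectly correlated with the dummy itself, while the centered product is nearly orthogonal to it:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
x = rng.normal(50, 5, n)                 # numeric factor with a large mean
d = rng.integers(0, 2, n).astype(float)  # two-level dummy

raw_prod = x * d                                   # uncentered interaction
cen_prod = (x - x.mean()) * (d - d.mean())         # centered interaction

# Correlation of each interaction column with the dummy main effect
r_raw = np.corrcoef(d, raw_prod)[0, 1]
r_cen = np.corrcoef(d, cen_prod)[0, 1]
print(f"raw product vs d:      r = {r_raw:.3f}")   # near 1: collinear
print(f"centered product vs d: r = {r_cen:.3f}")   # near 0: orthogonal
```

The near-collinearity in the raw coding is what inflates the standard errors of the lower-order terms.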

I hope this answer will help you,
Victor GUILLER

"It is not unusual for a well-designed experiment to analyze itself" (Box, Hunter and Hunter)
statman
Super User

Re: Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

As Victor says... here is JMP's explanation:

 

https://www.jmp.com/support/notes/37/925.html

 

"All models are wrong, some are useful" G.E.P. Box
julian
Community Manager

Re: Differing solutions when manually coding my interaction effects versus adding them through the Cross option in the Fit Model dialog box

Hi @JobbeGoossens,

It makes a lot of sense why this result would seem mysterious; it's a question that comes up a lot, and not just in the context of JMP. In case it helps, I want to link you to another discussion where I posted a video and explanation that demonstrates how centering polynomials affects the interpretation of the lower-order terms, which should help explain why the p-values are so different (they're testing something different).

 

https://community.jmp.com/t5/Discussions/estimates-in-multipule-regression/m-p/10965/highlight/true#...
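A rough numeric sketch of that interpretation point (Python/NumPy with simulated data, not JMP itself): with the uncentered coding, the x coefficient is the slope of y on x when d = 0; with the centered coding, it is the slope at the mean of d, which equals the raw slope plus the interaction estimate times mean(d):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.normal(5, 1, n)
d = rng.integers(0, 2, n).astype(float)
y = 2.0 + 1.0 * x - 0.5 * d + 1.5 * x * d + rng.normal(0, 1, n)

def coefs(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_raw = coefs(np.column_stack([np.ones(n), x, d, x * d]), y)
b_cen = coefs(np.column_stack([np.ones(n), x, d,
                               (x - x.mean()) * (d - d.mean())]), y)

# Uncentered: b_raw[1] is the slope of y on x when d = 0.
# Centered: b_cen[1] is the slope of y on x at the mean of d,
# i.e. exactly b_raw[1] + b_raw[3] * mean(d).
print(b_raw[1], b_cen[1], b_raw[1] + b_raw[3] * d.mean())
```

So the two p-values for the x term really are tests of two different hypotheses: "slope at d = 0" versus "slope at the average d."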

 

I hope this helps!

@julian